CN108762602B - Image display method and terminal equipment - Google Patents

Image display method and terminal equipment

Info

Publication number
CN108762602B
Authority
CN
China
Prior art keywords
width value
image
target
actual
terminal device
Prior art date
Legal status
Active
Application number
CN201810285989.3A
Other languages
Chinese (zh)
Other versions
CN108762602A (en)
Inventor
贾丹
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810285989.3A
Publication of CN108762602A
Application granted
Publication of CN108762602B

Classifications

    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/168 Driving aids for parking, e.g. acoustic or visual feedback on parking space

Abstract

The embodiment of the invention discloses an image display method and terminal equipment, relates to the technical field of terminals, and can solve the problem that, when a user parks a vehicle in an empty parking space, the vehicle is scratched by other obstacles beside the space. The specific scheme is as follows: under the condition that a first image is displayed on a current interface of the terminal equipment, acquiring a first width value, wherein the first image comprises an image of at least one reference object and an image of a target vacancy, and the first width value is the width value of the image of the first reference object displayed on the current interface; acquiring a first actual width value of the first reference object and a second actual width value of an object to be placed; determining a second width value according to the first width value, the first actual width value and the second actual width value, wherein the second width value is the width value of a target image of the object to be placed to be displayed on the current interface; and displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value.

Description

Image display method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of terminals, in particular to an image display method and terminal equipment.
Background
In daily life, when a user parks a vehicle in an empty parking space, the size of the space is limited, so the user usually judges by experience, or by repeated trial and error, whether the space can hold the vehicle.
However, such a judgment may be wrong, so that when the user parks the vehicle in the empty parking space, the vehicle is scratched by other obstacles beside the space.
Disclosure of Invention
The embodiment of the invention provides an image display method and terminal equipment, which can solve the problem that a user's vehicle is scratched by other obstacles beside an empty parking space because the user misjudges whether the vehicle can be placed in that space.
In order to solve the technical problem, the embodiment of the invention adopts the following technical scheme:
in a first aspect of the embodiments of the present invention, there is provided an image display method, including: under the condition that a first image is displayed on a current interface of the terminal equipment, acquiring a first width value, wherein the first image comprises an image of at least one reference object and an image of a target vacancy, the first width value is the width value of the image of the first reference object displayed on the current interface, and the first reference object is one of the at least one reference object; acquiring a first actual width value of a first reference object and a second actual width value of an object to be placed; determining a second width value according to the first width value, the first actual width value and the second actual width value, wherein the second width value is the width value of a target image of an object to be placed to be displayed on a current interface; and displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value.
In a second aspect of the embodiments of the present invention, there is provided a terminal device, including: the device comprises an acquisition unit, a determination unit and a display unit. The acquiring unit is used for acquiring a first width value under the condition that a first image is displayed on a current interface of the terminal equipment, wherein the first image comprises an image of at least one reference object and an image of the target vacancy, the first width value is the width value of the image of the first reference object displayed on the current interface, and the first reference object is one of the at least one reference object. The acquisition unit is further used for acquiring a first actual width value of the first reference object and a second actual width value of the object to be placed. And the determining unit is used for determining a second width value according to the first width value, the first actual width value and the second actual width value which are acquired by the acquiring unit, wherein the second width value is the width value of a target image of the object to be placed, which is to be displayed on the current interface. And the display unit is used for displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value determined by the determination unit.
In a third aspect of the embodiments of the present invention, a terminal device is provided, where the terminal device includes a processor, a memory, and a computer program stored in the memory and being executable on the processor, and the computer program, when executed by the processor, implements the steps of the method for displaying an image according to the first aspect.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method for image display according to the first aspect.
The embodiment of the invention provides an image display method, and terminal equipment can determine the width value of a target image of an object to be placed to be displayed on a current interface and display the target image of the object to be placed at the position of an image of a target vacancy according to a second width value. Because the second width value is determined by the terminal device according to the functional relationship among the first width value of the image of the first reference object, the first actual width value of the first reference object and the second actual width value of the object to be placed, the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, namely, the situation of 'placing the object to be placed at the position of the target vacancy' is simulated, and the situation can reflect the real situation of 'placing the object to be placed at the position of the target vacancy'; therefore, a user can more accurately and conveniently judge whether the object to be placed is placed at the position of the target vacancy according to the condition displayed on the current interface of the terminal equipment, so that the problem that the object to be placed is scratched by other objects beside the position of the target vacancy when the user places the object to be placed at the position of the target vacancy can be avoided.
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for displaying an image according to an embodiment of the present invention;
fig. 3 is a first schematic view of an example interface of a mobile phone according to an embodiment of the present invention;
fig. 4 is a second schematic view of an example interface of a mobile phone according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a method for displaying an image according to an embodiment of the present invention;
fig. 6 is a third schematic view of an example interface of a mobile phone according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and in the claims of embodiments of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first actual width value and the second actual width value, etc. are used to distinguish different actual width values, rather than to describe a particular order of actual width values. In the description of the embodiments of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
The term "and/or" herein is an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, a/B denotes a or B.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described as "exemplary" or "e.g.," an embodiment of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
The embodiment of the invention provides an image display method and terminal equipment, which can be applied to the process in which the terminal equipment displays a target image of an object to be placed. Specifically, the method can be applied to a process in which a user displays the target image of the object to be placed at the position of the image of a target vacancy, so as to determine whether the object to be placed can be placed in the target vacancy. The embodiment of the invention can solve the problem in the prior art that, when a user parks a vehicle in an empty parking space, the vehicle is scratched against other obstacles beside the space because the user misjudges whether the vehicle can be put into the space.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the image display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of the architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is the operating system layer of the android operating system and is the lowest software layer of the android operating system. The kernel layer provides core system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image display method may be run based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the image display method provided by the embodiment of the invention by running the software program in the android operating system.
In a first embodiment of the present invention, fig. 2 illustrates an image display method provided in an embodiment of the present invention, which can be applied to a terminal device having an android operating system as illustrated in fig. 1. Although the method flowchart shows a logical order for the image display method provided by the embodiment of the present invention, in some cases the steps shown or described may be performed in an order different from that here. Specifically, as shown in fig. 2, the method for displaying the image includes the following steps 201 to 204.
Step 201, under the condition that the first image is displayed on the current interface of the terminal device, the terminal device acquires a first width value.
In an embodiment of the present invention, the first image includes an image of at least one reference object and an image of the target space, the first width value is a width value of the image of the first reference object displayed on the current interface, and the first reference object is one of the at least one reference object.
In an application scenario of the embodiment of the present invention, before a user places an object to be placed on a target vacancy, a first image may be obtained through a camera of a terminal device; the terminal device displays the first image (the first image comprises an image of at least one reference object and an image of the target vacancy) on a current interface of the terminal device, and measures a width value of any one of the images of the at least one reference object (such as the image of the first reference object) to obtain a first width value.
Optionally, in the embodiment of the present invention, the reference object may be a vehicle, an intelligent machine, or the like.
Optionally, in this embodiment of the present invention, a user may turn on a camera (or a video camera) function in the terminal device to display the first image on the current interface, and acquire the first width value when the terminal device is triggered to turn on the "vacancy detection mode" function.
It is understood that the width value of an object generally refers to the distance between the two outermost protruding points of the object itself.
For example, take the terminal device as a mobile phone and the reference object as a reference vehicle. As shown in (a) of fig. 3, the mobile phone 10 displays icons of a "camera" Application (APP), an alarm application, a calendar application, and the like on the home interface 11 of the mobile phone 10. After a user operates (e.g., clicks or presses) the icon of the "camera" APP on the home interface 11, as shown in (b) of fig. 3, the home interface 12 of the "camera" APP is displayed on the current interface of the terminal device, and icons and keywords for user function selection, such as "video", "photo", "panorama", and "vacancy detection mode", are displayed on the home interface 12. The user aims the camera of the mobile phone 10 at a certain position; as shown in (c) of fig. 3, the interface 13 is displayed on the current interface of the mobile phone 10, and a first image (including the image 14 of a reference vehicle, the image 15 of another reference vehicle, and the target slot 16) is displayed on the interface 13. After the user operates (e.g., clicks or presses) the "vacancy detection mode" on the interface 13, as shown in (d) of fig. 3, the interface 17 is displayed on the current interface of the mobile phone 10, the image 14 of the reference vehicle, the image 15 of the reference vehicle and the target slot 16 are displayed on the interface 17, and the mobile phone 10 measures the width value d1 of the image of the first reference vehicle (e.g., the image 14 of the reference vehicle) displayed on the interface 17; that is, the first width value is d1.
It should be noted that, in the embodiment of the present invention, the target blank space 16 in (c) in fig. 3 and (d) in fig. 3 is represented by a dashed box, and the dashed box is only for convenience of understanding and description, and the mobile phone 10 may not display the dashed box when displaying the first image.
Optionally, in the embodiment of the present invention, the user may also open an APP in the "vacancy detection mode" to trigger the terminal device to display the first image on the current interface, and obtain the first width value.
It is understood that, in the embodiment of the present invention, the terminal device may also intelligently recognize the first image in a case where the first image is displayed on the current interface, so as to determine whether to acquire the first width value.
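For illustration only, the following is a minimal sketch of how the measurement in step 201 could be done on an Android terminal device; the class name FirstWidthMeasurer is hypothetical, and the sketch assumes some detector (not shown) has already supplied the on-screen bounding box of the first reference object's image. It simply converts the pixel width of that box into millimetres so that it can later be related to real-world widths.

```java
import android.content.Context;
import android.graphics.Rect;
import android.util.DisplayMetrics;

/**
 * Sketch of step 201 (acquiring the first width value), assuming the bounding
 * box of the first reference object's image on the current interface is known.
 */
public final class FirstWidthMeasurer {

    /** Returns the displayed width value (in mm) of the first reference image. */
    public static float firstWidthMm(Context context, Rect referenceImageBounds) {
        DisplayMetrics metrics = context.getResources().getDisplayMetrics();
        float widthPx = referenceImageBounds.width();
        // xdpi = physical pixels per inch along the x axis; 1 inch = 25.4 mm.
        return widthPx / metrics.xdpi * 25.4f;
    }

    private FirstWidthMeasurer() {}
}
```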
Step 202, the terminal device obtains a first actual width value of the first reference object and a second actual width value of the object to be placed.
It should be noted that, in the embodiment of the present invention, the actual width value refers to the actual width value of the object (such as the reference object and the object to be placed) itself.
Optionally, in this embodiment of the present invention, the method for acquiring, by the terminal device, the first actual width value of the first reference object in the step 202 may specifically be implemented by the following step 202a and step 202 b.
Step 202a, the terminal device performs image recognition on the image of the first reference object to obtain first characteristic information of the first reference object.
Optionally, in this embodiment of the present invention, the first feature information may include a name and a model of the first reference object.
In the embodiment of the invention, the terminal equipment can perform image recognition on the image of the first reference object to obtain the name and the model of the first reference object, and acquire the first actual width value according to the name and the model of the first reference object.
For example, the reference object is taken as a reference vehicle to be described herein. The terminal device performs image recognition on the image of the first reference vehicle to obtain the name (such as the brand) and the model of the first reference vehicle, and searches the first actual width value of the first reference vehicle of the brand and the model in the terminal device according to the brand and the model of the first reference vehicle, or the terminal device queries some websites through a mobile network to obtain the first actual width value of the first reference vehicle of the brand and the model.
For example, referring to (d) in fig. 3, after acquiring the first width value d1, the mobile phone 10 performs image recognition on the image 14 of the reference vehicle and determines that the brand of the first reference vehicle corresponding to the image 14 of the reference vehicle is "vehicle A" and the model is "320i"; the mobile phone 10 queries, through a mobile network, the actual width value d2 of the vehicle with the brand "vehicle A" and the model "320i" on some websites, and obtains the first actual width value d2 of the first reference vehicle.
Step 202b, the terminal device obtains a first actual width value according to the first characteristic information.
Optionally, in this embodiment of the present invention, the first feature information may include a name and a model of the first reference object, and the terminal device may obtain the first actual width value according to the name and the model of the first reference object.
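For illustration only, the following sketch shows one way step 202b could resolve the first actual width value once the first feature information (brand and model) is available; the class name ActualWidthLookup and the local table are hypothetical, and the two width figures are simply the illustrative values (1728 mm and 1859 mm) used in the worked example of this description. In practice the terminal device could instead query a website over the mobile network, as described above.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Sketch of step 202b: map (brand, model) of the recognised reference object
 * to its actual width in millimetres.
 */
public final class ActualWidthLookup {

    // Illustrative local table; a network query would replace this fallback.
    private static final Map<String, Integer> WIDTH_MM = new HashMap<>();
    static {
        WIDTH_MM.put("vehicle A/320i", 1728);
        WIDTH_MM.put("B vehicle/free light", 1859);
    }

    /** Returns the actual width value for the given brand and model, or null if unknown. */
    public static Integer actualWidthMm(String brand, String model) {
        return WIDTH_MM.get(brand + "/" + model);
    }

    private ActualWidthLookup() {}
}
```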
Optionally, in this embodiment of the present invention, the method for acquiring, by the terminal device, the second actual width value of the object to be placed in the step 202 may specifically be implemented by the following step 202c and step 202 d.
Step 202c, the terminal device obtains second characteristic information of the object to be placed.
Optionally, in this embodiment of the present invention, the second feature information may include at least one of a name and a model of the object to be placed, and the second actual width value.
Optionally, in the embodiment of the present invention, the object to be placed may be a vehicle, an intelligent machine, or the like.
Optionally, in this embodiment of the present invention, the second feature information may be pre-stored in the terminal device by the terminal device; or, after executing step 201, the terminal device may display an input box on the current interface, where the input box is used to instruct the user to input the second feature information, and the terminal device obtains the second feature information according to the input of the user and stores the second feature information in the terminal device.
Step 202d, the terminal device obtains a second actual width value according to the second characteristic information.
Optionally, in the embodiment of the present invention, when the second feature information includes the name and the model of the object to be placed, the terminal device may search, according to the name and the model of the object to be placed, the second actual width value of the object to be placed of the name and the model in the terminal device, or the terminal device obtains, through a mobile network, the second actual width value of the object to be placed of the name and the model by querying on some websites.
For example, the object to be placed is taken as the vehicle to be placed. Assuming that the second feature information includes the brand and the model of the vehicle to be placed, the brand of the vehicle to be placed is "B vehicle" and the model is "free light", the mobile phone 10 queries the actual width value d3 of the vehicle with the brand of "B vehicle" and the model of "free light" on some websites through the mobile network, and obtains a second actual width value d3 of the vehicle to be placed.
Optionally, in the embodiment of the present invention, under the condition that the second feature information includes the second actual width value, the terminal device may directly obtain the second actual width value according to the second feature information of the object to be placed.
Optionally, in the embodiment of the present invention, under the condition that the second feature information includes the name and the model of the object to be placed, and the second actual width value, the terminal device may directly obtain the second actual width value according to the second feature information of the object to be placed.
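For illustration only, the following sketch shows how the second feature information of steps 202c and 202d could be represented; the class name SecondFeatureInfo is hypothetical. It reflects the cases described above: the information may carry the name and model of the object to be placed, the second actual width value, or both, and when the width is already present it is used directly.

```java
import java.util.function.BiFunction;

/**
 * Sketch of steps 202c/202d: second feature information of the object to be
 * placed, either pre-stored on the terminal or entered by the user.
 */
public final class SecondFeatureInfo {
    public final String name;           // e.g. "B vehicle" (may be null)
    public final String model;          // e.g. "free light" (may be null)
    public final Integer actualWidthMm; // second actual width, if already known

    public SecondFeatureInfo(String name, String model, Integer actualWidthMm) {
        this.name = name;
        this.model = model;
        this.actualWidthMm = actualWidthMm;
    }

    /**
     * Resolves the second actual width value: use it directly when present,
     * otherwise fall back to a name/model lookup such as the one used for the
     * first reference object in step 202b.
     */
    public Integer resolveWidthMm(BiFunction<String, String, Integer> lookupByNameAndModel) {
        return actualWidthMm != null ? actualWidthMm : lookupByNameAndModel.apply(name, model);
    }
}
```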
And 203, the terminal device determines a second width value according to the first width value, the first actual width value and the second actual width value.
In the embodiment of the present invention, the second width value may be a width value of a target image of an object to be placed to be displayed on the current interface.
Optionally, in the embodiment of the present invention, the step 203 may be specifically implemented by the following step 203 a.
Step 203a, the terminal device calculates the second width value from the first width value, the first actual width value and the second actual width value by using a preset formula y = (b × x) / a, wherein y is the second width value, a is the first actual width value, b is the second actual width value, and x is the first width value.
In the embodiment of the present invention, the terminal device may substitute the acquired first width value, first actual width value and second actual width value into the formula y = (b × x) / a and calculate the second width value.
Illustratively, the first width value of the image of the first reference vehicle acquired by the cell phone 10 is d1, the first actual width value of the first reference vehicle is d2, and the second actual width value of the vehicle to be placed is d3; the cell phone 10 substitutes d1, d2 and d3 into the formula y = (b × x) / a and obtains the second width value y = (d3 × d1) / d2 by calculation.
For example, assume that the first width value d1 is 30 mm, the first actual width value d2 is 1728 mm, and the second actual width value d3 is 1859 mm. The mobile phone 10 substitutes 30 mm, 1728 mm and 1859 mm into the formula and obtains the second width value y = (1859 × 30) / 1728 ≈ 32 mm by calculation.
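For illustration only, the following sketch applies the preset formula of step 203a; the class name SecondWidthCalculator is hypothetical, and the numbers in the main method are the worked example above (1859 × 30 / 1728 ≈ 32 mm).

```java
/**
 * Sketch of step 203a: scale the displayed width of the reference image by
 * the ratio of the real-world widths, y = (b * x) / a.
 */
public final class SecondWidthCalculator {

    /**
     * @param firstWidthMm        x, displayed width of the first reference image
     * @param firstActualWidthMm  a, real width of the first reference object
     * @param secondActualWidthMm b, real width of the object to be placed
     * @return y, the width at which the target image should be displayed
     */
    public static float secondWidthMm(float firstWidthMm,
                                      float firstActualWidthMm,
                                      float secondActualWidthMm) {
        return secondActualWidthMm * firstWidthMm / firstActualWidthMm;
    }

    public static void main(String[] args) {
        // Worked example from the description: prints roughly 32.27 (mm).
        System.out.println(secondWidthMm(30f, 1728f, 1859f));
    }

    private SecondWidthCalculator() {}
}
```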
Optionally, in an implementation manner of the embodiment of the present invention, the second feature information at least includes a name and a model of the object to be placed, and the terminal device may obtain the original image of the object to be placed according to the name and the model of the object to be placed. Specifically, in the embodiment of the present invention, after step 203, the method for displaying an image according to the embodiment of the present invention further includes step 401 described below.
Step 401, the terminal device obtains an original image of the object to be placed according to the name and the model of the object to be placed.
Optionally, in the embodiment of the present invention, the terminal device may search, according to the name and the model of the object to be placed, the corresponding original image of the object to be placed in the terminal device; or, the terminal device may also obtain the original image of the object to be placed with the name and the model by querying on some websites through the mobile network according to the name and the model of the object to be placed.
It can be understood that, in the embodiment of the present invention, after acquiring the original image of the object to be placed, the terminal device may display the original image of the object to be placed on the current interface of the terminal.
And step 204, the terminal equipment displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value.
In the embodiment of the invention, after the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, the user can autonomously judge whether the object to be placed is placed at the position of the target vacancy according to the display condition on the current interface.
Illustratively, assume that the image of the target space is a dashed box 18 as shown in fig. 4 (a). As shown in fig. 4 (a), the mobile phone 10 displays the image 19 of the vehicle to be placed at the position of the dashed-line frame 18 according to the second width value y (for example, 32mm), and the user can determine, according to the situation displayed on the current interface (for example, the image 19 of the vehicle to be placed is spaced from the image 14 of the reference vehicle and from the image 15 of the reference vehicle), that the vehicle to be placed corresponding to the image 19 of the vehicle to be placed can be placed at the position of the target vacancy (the target vacancy corresponding to the dashed-line frame 18).
Illustratively, assume that the image of the target space is a dashed box 20 as shown in fig. 4 (b). As shown in fig. 4 (b), the mobile phone 10 displays the image 19 of the vehicle to be placed at the position of the dashed-line frame 20 according to the second width value y (for example, 32mm), and the user can determine that the vehicle to be placed corresponding to the image 19 of the vehicle to be placed cannot be placed at the position of the target vacancy (the target vacancy corresponding to the dashed-line frame 20) according to the situation displayed on the current interface (for example, the image 19 of the vehicle to be placed has an overlapping portion with the image 14 of the reference vehicle and has an overlapping portion with the image 15 of the reference vehicle).
Optionally, in this embodiment of the present invention, in combination with step 401, step 204 may be specifically implemented by step 204a and step 204b described below.
And 204a, the terminal equipment adjusts the width value of the original image of the object to be placed into a second width value to obtain a target image of the object to be placed.
In the embodiment of the invention, after the terminal device acquires the original image of the object to be placed, the width value of the original image of the object to be placed can be adjusted to the second width value, so that the target image of the object to be placed is obtained.
And step 204b, the terminal equipment displays the target image of the object to be placed at the position of the image of the target vacancy.
Optionally, in the embodiment of the present invention, the terminal device may display the target image of the object to be placed at the position of the image of the target vacancy by using an Augmented Reality (AR) technology.
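For illustration only, the following sketch shows one way steps 204a and 204b could be carried out on Android; the class name TargetImageRenderer is hypothetical, and it assumes the original image of the object to be placed is available as a Bitmap and that the on-screen position of the target vacancy is known. An AR framework could perform the same overlay, as mentioned above.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.util.DisplayMetrics;

/**
 * Sketch of steps 204a/204b: scale the original image so that its displayed
 * width equals the second width value, then draw it at the vacancy position.
 */
public final class TargetImageRenderer {

    public static void drawTargetImage(Canvas canvas, Bitmap originalImage,
                                       float secondWidthMm, DisplayMetrics metrics,
                                       float vacancyLeftPx, float vacancyTopPx) {
        // Convert the second width value from millimetres to screen pixels.
        int targetWidthPx = Math.round(secondWidthMm / 25.4f * metrics.xdpi);
        // Keep the aspect ratio of the original image when adjusting its width.
        int targetHeightPx = Math.round((float) originalImage.getHeight()
                * targetWidthPx / originalImage.getWidth());
        Bitmap targetImage = Bitmap.createScaledBitmap(
                originalImage, targetWidthPx, targetHeightPx, true);
        canvas.drawBitmap(targetImage, vacancyLeftPx, vacancyTopPx, null);
    }

    private TargetImageRenderer() {}
}
```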
The embodiment of the invention provides an image display method, and terminal equipment can determine the width value of a target image of an object to be placed to be displayed on a current interface and display the target image of the object to be placed at the position of an image of a target vacancy according to a second width value. Because the second width value is determined by the terminal device according to the functional relationship among the first width value of the image of the first reference object, the first actual width value of the first reference object and the second actual width value of the object to be placed, the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, namely, the situation of 'placing the object to be placed at the position of the target vacancy' is simulated, and the situation can reflect the real situation of 'placing the object to be placed at the position of the target vacancy'; therefore, a user can more accurately and conveniently judge whether the object to be placed is placed at the position of the target vacancy according to the condition displayed on the current interface of the terminal equipment, so that the problem that the object to be placed is scratched by other objects beside the position of the target vacancy when the user places the object to be placed at the position of the target vacancy can be avoided.
Optionally, in the embodiment of the present invention, as shown in fig. 5 with reference to fig. 2, after step 204, the method for displaying an image according to the embodiment of the present invention further includes step 501 described below.
Step 501, the terminal device displays the prompt information on the current interface.
In the embodiment of the invention, the prompt message can be used for prompting the user whether the object to be placed can be placed at the position of the target vacancy.
In the embodiment of the invention, after the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy, whether the object to be placed can be placed at the position of the target vacancy can be judged, and a prompt message is displayed on the current interface according to the judgment result so as to prompt a user whether the object to be placed can be placed at the position of the target vacancy.
Optionally, in the embodiment of the present invention, the terminal device may determine whether the object to be placed can be placed at the position of the target vacancy by comparing a size relationship between the second width value and the width value of the image of the target vacancy.
Optionally, in the embodiment of the present invention, when the second width value is greater than or equal to the width value of the image of the target vacancy, the prompt information may be used to prompt the user that the object to be placed cannot be placed at the position of the target vacancy; and under the condition that the second width value is smaller than the preset threshold, the prompt message can be used for prompting the user to place the object to be placed at the position of the target vacancy, and the preset threshold is smaller than the width value of the image of the target vacancy.
Optionally, in the embodiment of the present invention, the preset threshold may be input by a user, for example, the user may determine a numerical value (i.e., the preset threshold) according to the size of the object to be placed, and input the numerical value on the terminal device. Alternatively, the preset threshold may be calculated in advance by the terminal device, for example, the terminal device may calculate a value smaller than the width value of the image of the target vacancy (i.e. the preset threshold) in advance according to the width value of the image of the target vacancy, so as to avoid that the object to be placed is too close to the reference objects on both sides of the object to be placed when the object to be placed is placed at the position of the target vacancy.
Optionally, in the embodiment of the present invention, the terminal device may also perform image recognition on the content displayed on the current interface, and determine whether the object to be placed can be placed at the position of the target vacancy by recognizing whether the image of the object to be placed overlaps the image of each reference object in the at least one reference object and by recognizing the distance between the image of the object to be placed and the image of each reference object.
Optionally, in the embodiment of the present invention, in a case that the image of the object to be placed has an overlapping portion with the image of a reference object in the at least one reference object, the prompt information may be used to prompt the user that the object to be placed cannot be placed at the position of the target vacancy; in a case that the image of the object to be placed has no overlapping portion with the image of each reference object in the at least one reference object and the second width value is smaller than the preset threshold value, the prompt message may be used to prompt the user that the object to be placed can be placed at the position of the target vacancy.
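For illustration only, the following sketch shows the width-comparison variant of the decision behind step 501; the class name PlacementPrompter and the prompt strings are hypothetical, and the case between the preset threshold and the vacancy width is not specified by the description, so it is merely flagged here. The overlap-based variant described above is omitted.

```java
/**
 * Sketch of the decision behind step 501, comparing the second width value
 * with the width of the target vacancy image and a preset threshold that is
 * smaller than that width.
 */
public final class PlacementPrompter {

    /** Returns the prompt text to display on the current interface. */
    public static String promptFor(float secondWidthMm, float vacancyImageWidthMm,
                                   float presetThresholdMm) {
        if (secondWidthMm >= vacancyImageWidthMm) {
            return "The position of the target space is too narrow to place the object.";
        }
        if (secondWidthMm < presetThresholdMm) {
            return "The position of the target space can be used to place the object.";
        }
        // Between the threshold and the vacancy width the clearance is small;
        // the description leaves this case to the user's own judgment.
        return "The object may fit, but clearance to the neighbouring objects is small.";
    }

    private PlacementPrompter() {}
}
```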
For example, referring to (a) in fig. 4, as shown in (a) in fig. 6, the mobile phone 10 determines that the position of the target space can be used to place the vehicle to be placed (the vehicle to be placed corresponding to the image 19 of the vehicle to be placed), and the mobile phone 10 displays a prompt box 21 containing a prompt message "the position of the target space can be used to place the vehicle to be placed" on the current display interface, so as to prompt the user that the vehicle to be placed can be placed at the position of the target space.
For example, referring to fig. 4 (b), as shown in fig. 6 (b), the mobile phone 10 determines that the vehicle to be placed (the vehicle to be placed corresponding to the image 19 of the vehicle to be placed) cannot be placed in the target space, and the mobile phone 10 displays a prompt box 22 containing a prompt message "the position of the target space is too narrow to place the vehicle to be placed" on the current display interface, so as to prompt the user that the vehicle to be placed cannot be placed in the position of the target space.
The terminal equipment can display a prompt message on the current interface after displaying the target image of the object to be placed at the position of the image of the target vacancy so as to prompt a user whether to place the object to be placed at the position of the target vacancy; therefore, the user can more intuitively determine whether to place the object to be placed at the position of the target vacant position.
In a second embodiment of the present invention, fig. 7 shows a schematic diagram of a possible structure of a terminal device involved in the embodiment of the present invention, and as shown in fig. 7, the terminal device 70 may include: an acquisition unit 71, a determination unit 72, and a display unit 73.
Wherein, the obtaining unit 71 is configured to obtain a first width value in a case where a first image is displayed on the current interface of the terminal device 70, the first image including an image of at least one reference object and an image of the target blank, the first width value being a width value of an image of a first reference object displayed on the current interface, the first reference object being one of the at least one reference object. The obtaining unit 71 is further configured to obtain a first actual width value of the first reference object and a second actual width value of the object to be placed. The determining unit 72 is configured to determine a second width value according to the first width value, the first actual width value, and the second actual width value acquired by the acquiring unit 71, where the second width value is a width value of a target image of the object to be placed, which is to be displayed on the current interface. A display unit 73 for displaying the target image of the object to be placed at the position of the image of the target space according to the second width value determined by the determination unit 72.
In a possible implementation manner, the obtaining unit 71 is specifically configured to: performing image recognition on an image of a first reference object to obtain first characteristic information of the first reference object, wherein the first characteristic information comprises the name and the model of the first reference object; and acquiring a first actual width value according to the first characteristic information. The obtaining unit 71 is specifically configured to: acquiring second characteristic information of the object to be placed, wherein the second characteristic information comprises at least one of the name and the model of the object to be placed and a second actual width value; and acquiring a second actual width value according to the second characteristic information.
In a possible implementation manner, the second characteristic information at least includes a name and a model of the object to be placed. Correspondingly, the obtaining unit 71 is further configured to obtain an original image of the object to be placed according to the name and the model of the object to be placed after the determining unit 72 determines the second width value according to the first width value, the first actual width value, and the second actual width value. The display unit 73 is specifically configured to: adjusting the width value of the original image of the object to be placed to be a second width value to obtain a target image of the object to be placed; and displaying the target image of the object to be placed at the position of the image of the target vacancy.
In a possible implementation manner, the determining unit 72 is specifically configured to: calculate the second width value from the first width value, the first actual width value and the second actual width value by using a preset formula y = (b × x) / a, wherein y is the second width value, a is the first actual width value, b is the second actual width value, and x is the first width value.
In a possible implementation manner, the display unit 73 is further configured to display a prompt message on the current interface after displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value determined by the determination unit 72, where the prompt message is used for prompting the user whether the user can place the object to be placed at the position of the target vacancy; when the second width value is larger than or equal to the width value of the image of the target vacancy, the prompt information is used for prompting a user that the object to be placed cannot be placed at the position of the target vacancy; and under the condition that the second width value is smaller than the preset threshold value, the prompt information is used for prompting the user to place the object to be placed at the position of the target vacancy, and the preset threshold value is smaller than the width value of the image of the target vacancy.
The terminal device 70 provided in the embodiment of the present invention can implement each process implemented by the terminal device in the foregoing method embodiments, and for avoiding repetition, detailed descriptions are not repeated here.
The embodiment of the invention provides a terminal device, which can determine the width value of a target image of an object to be placed, which is to be displayed on a current interface, and display the target image of the object to be placed at the position of an image of a target vacancy according to a second width value. Because the second width value is determined by the terminal device according to the functional relationship among the first width value of the image of the first reference object, the first actual width value of the first reference object and the second actual width value of the object to be placed, the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, namely, the situation of 'placing the object to be placed at the position of the target vacancy' is simulated, and the situation can reflect the real situation of 'placing the object to be placed at the position of the target vacancy'; therefore, a user can more accurately and conveniently judge whether the object to be placed is placed at the position of the target vacancy according to the condition displayed on the current interface of the terminal equipment, so that the problem that the object to be placed is scratched by other objects beside the position of the target vacancy when the user places the object to be placed at the position of the target vacancy can be avoided.
In a third embodiment of the present invention, fig. 8 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 8, the terminal device 100 includes but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111.
It should be noted that the terminal device structure shown in fig. 8 does not constitute a limitation of the terminal device, and the terminal device may include more or fewer components than those shown, or combine some components, or arrange different components, as will be understood by those skilled in the art. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to, in a case that a first image is displayed on a current interface of the terminal device, obtain a first width value, where the first image includes an image of at least one reference object and an image of the target vacancy, the first width value is a width value of an image of a first reference object displayed on the current interface, and the first reference object is one of the at least one reference object; acquiring a first actual width value of a first reference object and a second actual width value of an object to be placed; determining a second width value according to the first width value, the first actual width value and the second actual width value, wherein the second width value is the width value of a target image of an object to be placed to be displayed on a current interface; and displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value.
The embodiment of the invention provides a terminal device, which can determine the width value of a target image of an object to be placed, which is to be displayed on a current interface, and display the target image of the object to be placed at the position of an image of a target vacancy according to a second width value. Because the second width value is determined by the terminal device according to the functional relationship among the first width value of the image of the first reference object, the first actual width value of the first reference object and the second actual width value of the object to be placed, the terminal device displays the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, namely, the situation of 'placing the object to be placed at the position of the target vacancy' is simulated, and the situation can reflect the real situation of 'placing the object to be placed at the position of the target vacancy'; therefore, a user can more accurately and conveniently judge whether the object to be placed is placed at the position of the target vacancy according to the condition displayed on the current interface of the terminal equipment, so that the problem that the object to be placed is scratched by other objects beside the position of the target vacancy when the user places the object to be placed at the position of the target vacancy can be avoided.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the touch operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 8 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device; this is not limited here.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, which includes a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method of image display, the method comprising:
under the condition that a first image is displayed on a current interface of a terminal device, acquiring a first width value, wherein the first image comprises an image of at least one reference object and an image of a target vacancy, the first width value is the width value of the image of the first reference object displayed on the current interface, and the first reference object is one of the at least one reference object;
acquiring a first actual width value of the first reference object and a second actual width value of the object to be placed;
determining a second width value according to the first width value, the first actual width value and the second actual width value, wherein the second width value is the width value of a target image of the object to be placed to be displayed on the current interface;
and displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value.
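For orientation only, the following Kotlin sketch walks through the four steps of claim 1 with invented values; the type and function names (ReferenceObject, ObjectToPlace, secondWidthPx) and all dimensions are illustrative assumptions, not identifiers or data from the patent.

```kotlin
// Illustrative sketch of the claim 1 flow; names and values are assumptions, not from the patent.
data class ReferenceObject(val displayedWidthPx: Double, val actualWidthM: Double)
data class ObjectToPlace(val actualWidthM: Double)

// Step 3: derive the on-screen width (second width value) for the object to be placed
// from the on-screen width of the reference image and the two actual widths.
fun secondWidthPx(ref: ReferenceObject, obj: ObjectToPlace): Double {
    require(ref.actualWidthM > 0.0) { "reference object needs a positive actual width" }
    return obj.actualWidthM / ref.actualWidthM * ref.displayedWidthPx
}

fun main() {
    // Step 1: first width value, read from the image of the reference object on the current interface.
    // Step 2: actual widths of the reference object and of the object to be placed.
    val ref = ReferenceObject(displayedWidthPx = 300.0, actualWidthM = 0.60)
    val obj = ObjectToPlace(actualWidthM = 0.45)

    // Step 3: second width value.
    val widthPx = secondWidthPx(ref, obj)

    // Step 4: the target image would be drawn at the target vacancy with this width.
    println("render the target image about $widthPx px wide at the target vacancy")  // 225.0
}
```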
2. The method of claim 1, wherein said obtaining a first actual width value of said first reference object comprises:
performing image recognition on the image of the first reference object to obtain first characteristic information of the first reference object, wherein the first characteristic information comprises the name and the model of the first reference object;
acquiring the first actual width value according to the first characteristic information;
the acquiring of the second actual width value of the object to be placed comprises:
acquiring second characteristic information of the object to be placed, wherein the second characteristic information comprises at least one of the name and the model of the object to be placed and the second actual width value;
and acquiring the second actual width value according to the second characteristic information.
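One way to read claim 2 is as a lookup keyed on the recognized name and model. The catalogue below is a hypothetical stand-in for whatever database or server the terminal device would actually consult; all entries and names are invented for illustration.

```kotlin
// Hypothetical lookup from recognized feature information to an actual width in metres.
// The catalogue entries are invented for illustration only.
data class FeatureInfo(val name: String, val model: String)

val widthCatalogue: Map<FeatureInfo, Double> = mapOf(
    FeatureInfo("refrigerator", "RF-230") to 0.60,
    FeatureInfo("washing machine", "WM-8A") to 0.55
)

// First (or second) actual width value, resolved from the feature information.
fun actualWidthFor(info: FeatureInfo): Double? = widthCatalogue[info]

fun main() {
    val recognized = FeatureInfo("refrigerator", "RF-230")  // e.g., output of image recognition
    println(actualWidthFor(recognized) ?: "width unknown for $recognized")  // 0.6
}
```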
3. The method according to claim 2, wherein the second characteristic information includes at least a name and a model of the object to be placed;
after determining a second width value according to the first width value, the first actual width value, and the second actual width value, the method further includes:
acquiring an original image of the object to be placed according to the name and the model of the object to be placed;
displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, wherein the displaying comprises:
adjusting the width value of the original image of the object to be placed to be the second width value to obtain a target image of the object to be placed;
and displaying the target image of the object to be placed at the position of the image of the target vacancy.
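Claim 3 amounts to scaling the original image of the object to be placed so that its width equals the second width value, with the height following the same factor. The sketch below works on plain width/height pairs rather than any real bitmap API, and the values are illustrative only.

```kotlin
// Aspect-ratio-preserving resize of the original image to the second width value;
// a plain-data sketch, not tied to any particular bitmap or drawing API.
data class ImageSize(val widthPx: Double, val heightPx: Double)

fun scaleToWidth(original: ImageSize, secondWidthPx: Double): ImageSize {
    require(original.widthPx > 0.0) { "original image must have a positive width" }
    val factor = secondWidthPx / original.widthPx
    return ImageSize(secondWidthPx, original.heightPx * factor)
}

fun main() {
    val original = ImageSize(widthPx = 800.0, heightPx = 1200.0)  // original image of the object to be placed
    println(scaleToWidth(original, secondWidthPx = 225.0))        // ImageSize(widthPx=225.0, heightPx=337.5)
}
```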
4. The method of claim 1, wherein determining a second width value based on the first width value, the first actual width value, and the second actual width value comprises:
according to the first width value, the first actual width value and the second actual width value, calculating the second width value by a preset formula
y = (b / a) × x,
wherein y is the second width value, a is the first actual width value, b is the second actual width value, and x is the first width value.
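As a worked example with illustrative numbers (not taken from the patent), and assuming the proportional form reconstructed above: if the first reference object is actually 0.60 m wide (a) and its image occupies 300 pixels on the current interface (x), then an object to be placed that is actually 0.45 m wide (b) gives y = (0.45 / 0.60) × 300 = 225 pixels as the second width value.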
5. The method of any of claims 1-4, wherein after displaying the target image of the object to be placed at the location of the image of the target slot according to the second width value, the method further comprises:
displaying prompt information on the current interface, wherein the prompt information is used for prompting a user whether the object to be placed can be placed at the position of the target vacancy;
when the second width value is greater than or equal to the width value of the image of the target vacancy, the prompt message is used for prompting a user that the object to be placed cannot be placed at the position of the target vacancy; and under the condition that the second width value is smaller than a preset threshold value, the prompt information is used for prompting a user to place the object to be placed at the position of the target vacancy, and the preset threshold value is smaller than the width value of the image of the target vacancy.
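The two prompt conditions of claim 5 reduce to comparing the second width value with the displayed width of the target vacancy and with a preset threshold below it. The sketch below is illustrative; the message strings and the handling of the band between the threshold and the vacancy width, which the claim leaves unspecified, are assumptions rather than wording from the patent.

```kotlin
// Sketch of the prompt decision in claim 5; widths are on-screen pixels and
// the message strings are placeholders, not the patent's wording.
fun placementPrompt(secondWidthPx: Double, vacancyWidthPx: Double, presetThresholdPx: Double): String {
    require(presetThresholdPx < vacancyWidthPx) { "the preset threshold must be below the vacancy width" }
    return when {
        secondWidthPx >= vacancyWidthPx -> "the object to be placed cannot be placed at the target vacancy"
        secondWidthPx < presetThresholdPx -> "the object to be placed can be placed at the target vacancy"
        else -> "fit is tight; this band is not specified by the claim"
    }
}

fun main() {
    println(placementPrompt(secondWidthPx = 225.0, vacancyWidthPx = 260.0, presetThresholdPx = 240.0))
}
```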
6. A terminal device, characterized in that the terminal device comprises:
an obtaining unit, configured to obtain a first width value when a first image is displayed on a current interface of the terminal device, where the first image includes an image of at least one reference object and an image of a target vacancy, the first width value is a width value of an image of a first reference object displayed on the current interface, and the first reference object is one of the at least one reference object;
the acquisition unit is further used for acquiring a first actual width value of the first reference object and a second actual width value of the object to be placed;
a determining unit, configured to determine a second width value according to the first width value, the first actual width value, and the second actual width value acquired by the acquiring unit, where the second width value is a width value of a target image of the object to be placed, which is to be displayed on the current interface;
and the display unit is used for displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value determined by the determination unit.
7. The terminal device according to claim 6, wherein the obtaining unit is specifically configured to:
performing image recognition on the image of the first reference object to obtain first characteristic information of the first reference object, wherein the first characteristic information comprises the name and the model of the first reference object;
acquiring the first actual width value according to the first characteristic information;
the obtaining unit is specifically configured to:
acquiring second characteristic information of the object to be placed, wherein the second characteristic information comprises at least one of the name and the model of the object to be placed and the second actual width value;
and acquiring the second actual width value according to the second characteristic information.
8. The terminal device according to claim 7, wherein the second characteristic information includes at least a name and a model of the object to be placed;
the obtaining unit is further configured to obtain an original image of the object to be placed according to the name and the model of the object to be placed after the determining unit determines the second width value according to the first width value, the first actual width value and the second actual width value;
the display unit is specifically configured to:
adjusting the width value of the original image of the object to be placed to be the second width value to obtain a target image of the object to be placed;
and displaying the target image of the object to be placed at the position of the image of the target vacancy.
9. The terminal device according to claim 6, wherein the determining unit is specifically configured to:
according to the first width value, the first actual width value and the second actual width value, calculating the second width value by a preset formula
y = (b / a) × x,
wherein y is the second width value, a is the first actual width value, b is the second actual width value, and x is the first width value.
10. The terminal device according to any one of claims 6 to 9, wherein the display unit is further configured to display a prompt message on the current interface after displaying the target image of the object to be placed at the position of the image of the target vacancy according to the second width value, wherein the prompt message is used for prompting a user whether the object to be placed can be placed at the position of the target vacancy;
when the second width value is greater than or equal to the width value of the image of the target vacancy, the prompt message is used for prompting a user that the object to be placed cannot be placed at the position of the target vacancy; and under the condition that the second width value is smaller than a preset threshold value, the prompt information is used for prompting a user to place the object to be placed at the position of the target vacancy, and the preset threshold value is smaller than the width value of the image of the target vacancy.
CN201810285989.3A 2018-04-03 2018-04-03 Image display method and terminal equipment Active CN108762602B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810285989.3A CN108762602B (en) 2018-04-03 2018-04-03 Image display method and terminal equipment

Publications (2)

Publication Number Publication Date
CN108762602A CN108762602A (en) 2018-11-06
CN108762602B true CN108762602B (en) 2020-07-21

Family

ID=63980642

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810285989.3A Active CN108762602B (en) 2018-04-03 2018-04-03 Image display method and terminal equipment

Country Status (1)

Country Link
CN (1) CN108762602B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102241252A (en) * 2010-05-12 2011-11-16 大众汽车有限公司 Method for parking a vehicle or leaving a parking space and corresponding assistance system and vehicle
CN106991723A (en) * 2015-10-12 2017-07-28 莲嚮科技有限公司 Interactive house browsing method and system of three-dimensional virtual reality

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4162836A (en) * 1978-06-21 1979-07-31 Polaroid Corporation Electronic flash inhibit arrangement
CN103063143B (en) * 2012-12-03 2016-05-11 苏州佳世达电通有限公司 Measuring method based on image recognition and system thereof
CN104837683A (en) * 2012-12-12 2015-08-12 本田技研工业株式会社 Parking space detector
CN103777757B (en) * 2014-01-15 2016-08-31 天津大学 A kind of place virtual objects in augmented reality the system of combination significance detection
US9940908B2 (en) * 2015-09-29 2018-04-10 Aisin Seiki Kabushiki Kaisha Display control device
CN107274266A (en) * 2017-06-09 2017-10-20 北京小米移动软件有限公司 Method of Commodity Recommendation and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant