CN108596971B - Image display method and device - Google Patents

Image display method and device

Info

Publication number
CN108596971B
Authority
CN
China
Prior art keywords
user
image
terminal
candidate home
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810395072.9A
Other languages
Chinese (zh)
Other versions
CN108596971A (en)
Inventor
付文君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201810395072.9A
Publication of CN108596971A
Application granted
Publication of CN108596971B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to an image display method and device. The method includes: receiving scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image; invoking a camera of a second terminal of a second user to capture a real image of the second user's location; determining to-be-placed position information of the candidate home furnishing at the second user's location; synthesizing the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information to obtain an augmented reality image; and displaying the augmented reality image on a display screen of the second terminal. With the method and device, users located at different places can cooperate so that the placement effect of the candidate home furnishing at the second user's location can be seen intuitively without moving the item home, which reduces the probability of returns caused by a poor placement effect, makes shopping more convenient, and improves the shopping experience.

Description

Image display method and device
Technical Field
The disclosure relates to the technical field of terminals, and in particular to an image display method and device.
Background
Augmented reality (AR) is a technique that computes the position and angle of a camera image in real time and overlays corresponding virtual images, aiming to seamlessly blend the virtual world with the real world on the screen and allow the two to interact. As the computing power of portable electronic products improves, AR technology is being applied more and more widely.
In practice, when a user selects furniture in a shopping mall, the user usually considers factors such as size, shape, material, and brand, but cannot know the final effect of placing the item in his or her own home; the user can only judge the placement effect by feeling and experience.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present disclosure provide an image display method and apparatus. The technical solution is as follows:
According to a first aspect of the embodiments of the present disclosure, there is provided an image display method, including:
receiving scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image;
invoking a camera of a second terminal of a second user to capture a real image of the second user's location;
determining to-be-placed position information of the candidate home furnishing at the second user's location;
synthesizing the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
displaying the augmented reality image on a display screen of the second terminal.
In one embodiment, the method further comprises:
sending the augmented reality image to the first terminal.
In one embodiment, determining the to-be-placed position information of the candidate home furnishing at the second user's location includes:
displaying the real image of the second user's location on the display screen of the second terminal and superimposing the scan image of the candidate home furnishing on it;
detecting an augmented reality indication operation of the second user;
determining, according to the augmented reality indication operation, a target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
determining information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the scan data further includes size information, and determining the to-be-placed position information of the candidate home furnishing at the second user's location includes:
searching, according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information; and
determining position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the method further comprises:
displaying the real image of the second user's location on the display screen of the second terminal; and
displaying a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, the recommendation frame indicating a recommended placement position of the candidate home furnishing at the second user's location.
According to a second aspect of the embodiments of the present disclosure, there is provided an image display apparatus including:
a receiving module configured to receive scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image;
a shooting module configured to invoke a camera of a second terminal of a second user to capture a real image of the second user's location;
a determining module configured to determine to-be-placed position information of the candidate home furnishing at the second user's location;
a synthesizing module configured to synthesize the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
a first display module configured to display the augmented reality image on a display screen of the second terminal.
In one embodiment, the apparatus further includes:
a sending module configured to send the augmented reality image to the first terminal.
In one embodiment, the determining module includes:
a display sub-module configured to display the real image of the second user's location on the display screen of the second terminal and superimpose the scan image of the candidate home furnishing;
a detection sub-module configured to detect an augmented reality indication operation of the second user;
a first determining sub-module configured to determine, according to the augmented reality indication operation, a target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
a second determining sub-module configured to determine information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the scan data further includes size information, and the determining module includes:
a searching sub-module configured to search, according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information; and
a third determining sub-module configured to determine position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the apparatus further includes:
a second display module configured to display the real image of the second user's location on the display screen of the second terminal; and
a third display module configured to display a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, the recommendation frame indicating a recommended placement position of the candidate home furnishing at the second user's location.
According to a third aspect of the embodiments of the present disclosure, there is provided an image display apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method according to any of the embodiments of the first aspect described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of any of the embodiments of the first aspect described above.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: the first terminal sends the scan image of the candidate home furnishing to the second terminal, and the second terminal synthesizes the scan image with the real image of the second user's location according to the to-be-placed position information of the candidate home furnishing at that location, obtaining an augmented reality image that shows the placement effect of the candidate home furnishing at the second user's location. Through cooperation between users located at different places, the placement effect can be seen intuitively without moving the item home, which reduces the probability of returns caused by a poor placement effect, makes shopping more convenient, and improves the shopping experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating an image display method according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating an image display method according to an exemplary embodiment.
Fig. 3 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 4 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 5 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 8 is a block diagram of an image display apparatus according to an exemplary embodiment.
Fig. 9 is a block diagram of an image display apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.
In practice, when a user selects furniture in a shopping mall, the user usually considers factors such as size, shape, material, and brand, but cannot know the final effect of placing the item in his or her own home and can only judge the placement effect by feeling and experience. As a result, there is a high probability of returning the item because the actual placement effect is poor, which is inconvenient for the user and leads to a poor shopping experience.
To solve the above problems, an embodiment of the present disclosure provides an image display method, including: receiving scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image; invoking a camera of a second terminal of a second user to capture a real image of the second user's location; determining to-be-placed position information of the candidate home furnishing at the second user's location; synthesizing the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information to obtain an augmented reality image; and displaying the augmented reality image on a display screen of the second terminal. With this method, the scan image provided by the first user is combined with the real image of the second user's location, so the placement effect of the candidate home furnishing at the second user's location can be seen intuitively without moving the item home, which reduces the probability of returns caused by a poor placement effect, makes shopping more convenient, and improves the shopping experience.
Based on the above analysis, the following specific examples are presented.
FIG. 1 is a flowchart illustrating an image display method according to an exemplary embodiment; as shown in fig. 1, the method comprises the following steps 101-105:
In step 101, scan data of a candidate home furnishing sent by a first terminal of a first user is received; the scan data includes a scan image.
By way of example, the execution subject of the method may be a second terminal, such as an AR device. The first user and the second user are located at different places; for example, the first user may be in a shopping mall and the second user at home. The first user selects a furnishing to be purchased in the mall and scans the candidate home furnishing from different angles with the first terminal, such as the camera of a mobile phone or an AR device, to obtain scan data of the candidate home furnishing, the scan data including a scan image. The first user then sends the scan data of the candidate home furnishing to the second terminal of the second user through the first terminal.
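By way of illustration only, the scan data exchanged between the two terminals could be modeled as a small serializable record. The field names and the JSON transport below are assumptions introduced for this sketch and are not part of the patent's disclosure.

```python
import base64
import json
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ScanData:
    """Hypothetical payload sent from the first terminal to the second terminal."""
    item_id: str                                             # identifier chosen by the first user (assumed)
    scan_images: List[bytes] = field(default_factory=list)   # encoded image bytes, one per scan angle
    size_mm: Tuple[int, int, int] = (0, 0, 0)                # width, depth, height of the candidate furnishing

    def to_json(self) -> str:
        # Encode binary images as base64 so the record can travel over any text channel.
        return json.dumps({
            "item_id": self.item_id,
            "scan_images": [base64.b64encode(img).decode("ascii") for img in self.scan_images],
            "size_mm": list(self.size_mm),
        })

    @staticmethod
    def from_json(payload: str) -> "ScanData":
        raw = json.loads(payload)
        return ScanData(
            item_id=raw["item_id"],
            scan_images=[base64.b64decode(s) for s in raw["scan_images"]],
            size_mm=tuple(raw["size_mm"]),
        )

# Usage: the first terminal serializes and sends, the second terminal parses.
sent = ScanData("sofa-01", [b"\xff\xd8fake-jpeg"], (2100, 900, 800)).to_json()
received = ScanData.from_json(sent)
```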
In step 102, a camera of the second terminal of the second user is invoked to capture a real image of the second user's location.
For example, the second terminal invokes its own camera to capture a real image of the second user's location.
In step 103, to-be-placed position information of the candidate home furnishing at the second user's location is determined.
For example, the second terminal may determine the to-be-placed position information of the candidate home furnishing at the second user's location in at least either of the following ways:
mode 1), mode of designating a position to be put by the second user: the scanning data also comprises size information of candidate households; displaying a real image of the location of the second user on a display screen of the second terminal, and displaying a scanning image of the candidate home in a superposition manner on the display screen according to the size information of the candidate home; the second user can move the scanned image of the candidate home on the display screen through AR operation, and the scanned image of the candidate home is moved to a position to be placed expected by the second user. Specifically, the second terminal detects an AR indication operation of the second user; according to the AR indication operation, determining the target position of the scanning image of the candidate home on the display screen relative to the real image, and determining the information of the target position as the information of the position to be placed of the candidate home at the position of the second user.
Mode 2) The second terminal determines the to-be-placed position automatically: the scan data further includes size information of the candidate home furnishing; the second terminal searches the real image of the second user's location for a target area matching the size information of the candidate home furnishing, and determines the position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location. For example, the real image of the second user's location is displayed on the display screen of the second terminal, and a recommendation frame is displayed at the position corresponding to the target area; the recommendation frame indicates the recommended placement position of the candidate home furnishing at the second user's location. One possible way of performing such a size-based search is sketched below.
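As a minimal sketch of mode 2), and assuming the second terminal has already derived a binary free-space mask from the real image (a step the patent does not specify), the search for a target area matching the candidate furnishing's footprint can be done with an integral image. The mask, the pixel footprint, and the scanning order are all assumptions made for this illustration.

```python
import numpy as np

def find_target_area(free_mask: np.ndarray, req_h: int, req_w: int):
    """Return (top, left) of the first req_h x req_w window of the real image that is
    entirely free, or None if no such area exists.

    free_mask: 2D array of 0/1 where 1 means the pixel can receive the furnishing.
    req_h, req_w: required footprint of the candidate furnishing in pixels, derived
                  from its size information and the camera scale (assumed known).
    """
    H, W = free_mask.shape
    if req_h > H or req_w > W:
        return None
    # Integral image lets us test each candidate window in O(1).
    ii = np.pad(free_mask.astype(np.int64), ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    need = req_h * req_w
    for top in range(H - req_h + 1):
        for left in range(W - req_w + 1):
            total = (ii[top + req_h, left + req_w] - ii[top, left + req_w]
                     - ii[top + req_h, left] + ii[top, left])
            if total == need:  # every pixel in the window is free
                return top, left
    return None

# Example: a 4 x 6 free strip inside a 10 x 10 mask.
mask = np.zeros((10, 10), dtype=np.uint8)
mask[5:9, 2:8] = 1
print(find_target_area(mask, 4, 6))  # -> (5, 2); this area would receive the recommendation frame
```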
In step 104, the scan image of the candidate home furnishing is synthesized with the real image of the second user's location according to the to-be-placed position information, to obtain the augmented reality image.
By way of example, the second terminal synthesizes the scan image of the candidate home furnishing with the real image of the second user's location by means of AR technology, according to the to-be-placed position information of the candidate home furnishing at the second user's location, to obtain the augmented reality image. For example, in the augmented reality image, the scan image of the candidate home furnishing is located in the area of the real image of the room that corresponds to the to-be-placed position information. A minimal sketch of such a synthesis is given below.
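The patent only states that the images are synthesized using AR technology; treating the scan image as an RGBA overlay and alpha-blending it onto the real image at the to-be-placed position is an assumption made here for a concrete, minimal sketch.

```python
import numpy as np

def composite(real_image: np.ndarray, scan_rgba: np.ndarray, top: int, left: int) -> np.ndarray:
    """Blend an RGBA scan image of the candidate furnishing onto the RGB real image
    at the to-be-placed position (top, left), returning the augmented reality frame."""
    out = real_image.astype(np.float32).copy()
    h, w = scan_rgba.shape[:2]
    region = out[top:top + h, left:left + w]
    alpha = scan_rgba[:, :, 3:4].astype(np.float32) / 255.0
    rgb = scan_rgba[:, :, :3].astype(np.float32)
    out[top:top + h, left:left + w] = alpha * rgb + (1.0 - alpha) * region
    return out.astype(np.uint8)

# Example with synthetic data: a gray "room" frame and a half-transparent red overlay.
room = np.full((480, 640, 3), 128, dtype=np.uint8)
overlay = np.zeros((100, 150, 4), dtype=np.uint8)
overlay[..., 0] = 200   # red channel
overlay[..., 3] = 180   # alpha
ar_frame = composite(room, overlay, top=300, left=200)
```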
In step 105, an augmented reality image is displayed on a display screen of the second terminal.
By way of example, the second terminal places the scan image of the candidate home furnishing at a suitable place in the AR scene of the second user's location according to the scan data provided by the first terminal, and synthesizes it with the real image of the second user's location through AR technology, so that the second user can intuitively see the placement effect of the candidate home furnishing at his or her own location.
According to the technical solution provided by this embodiment of the present disclosure, the first terminal sends the scan image of the candidate home furnishing to the second terminal, and the second terminal synthesizes the scan image with the real image of the second user's location according to the to-be-placed position information of the candidate home furnishing at that location, obtaining an augmented reality image that shows the placement effect of the candidate home furnishing at the second user's location. Through cooperation between users located at different places, the placement effect can be seen intuitively without moving the item home, which reduces the probability of returns caused by a poor placement effect, makes shopping more convenient, and improves the shopping experience.
Fig. 2 is a flowchart illustrating an image display method according to an exemplary embodiment. As shown in fig. 2, on the basis of the embodiment shown in fig. 1, the image display method according to the present disclosure may include the following steps 201 to 210:
in step 201, a first terminal of a first user scans candidate home furnishings to obtain scan data of the candidate home furnishing, wherein the scan data includes a scan image.
By way of example, the present embodiment may be performed by a first terminal of a first user and a second terminal of a second user, each of which may be, for example, a mobile phone or an AR device. The scan data also includes size information of the candidate home furnishing. The first user and the second user are located at different places; for example, the first user may be in a shopping mall and the second user at home. The first user selects a furnishing to be purchased in the mall and scans the candidate home furnishing from different angles with the camera of the first terminal to obtain the scan data of the candidate home furnishing, including the scan image. The first user then sends the scan data of the candidate home furnishing to the second terminal of the second user through the first terminal.
In step 202, the first terminal sends the scan data of the candidate home furnishing to the second terminal of the second user.
In step 203, the second terminal of the second user receives the scan data of the candidate home furnishing sent by the first terminal, and invokes its camera to capture a real image of the second user's location.
In step 204, the second terminal displays the real image of the second user's location on its display screen and superimposes the scan image of the candidate home furnishing on it.
In step 205, the second terminal detects an augmented reality indication operation of the second user.
In step 206, the second terminal determines, according to the augmented reality indication operation, the target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal.
In step 207, the second terminal determines the information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In step 208, the second terminal synthesizes the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain the augmented reality image.
For example, in the augmented reality image, the scan image of the candidate home furnishing is located in the area of the real image of the room that corresponds to the to-be-placed position information.
In step 209, the second terminal displays the augmented reality image on a display screen of the second terminal.
In step 210, the second terminal transmits the augmented reality image to the first terminal.
For example, consider a scenario in which a first user and a second user at different places shop cooperatively based on augmented reality technology: the first user is in a shopping mall, the second user is at home, and the two need to purchase a furnishing together. The first user selects a furnishing to be purchased in the mall and scans the candidate home furnishing from different angles with the first terminal; the first terminal shares the scan data of the candidate home furnishing with the second user; the second terminal of the second user places the virtual image of the candidate home furnishing at a suitable place in the home scene according to the scan data provided by the first user and checks the final effect of the candidate home furnishing in the home; the second user then shares this final effect with the first user through the second terminal, and the two discuss together whether the candidate home furnishing is suitable. A highly simplified sketch of this message flow follows.
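Purely to summarize the flow of steps 201 to 210, the sketch below strings the two roles together as plain function calls; a real system would exchange these payloads over a network, and every name and data format here is an illustrative assumption rather than the patent's implementation.

```python
def first_terminal_scan_and_send(send):
    """First terminal (in the mall): scan the candidate furnishing and share the data (steps 201-202)."""
    scan_data = {"scan_image": "sofa_front.png", "size_mm": (2100, 900, 800)}
    send(scan_data)  # e.g. over an instant-messaging channel or a dedicated service (assumed)

def second_terminal_receive_and_compose(scan_data, placement):
    """Second terminal (at home): capture the room, compose, and return the AR image (steps 203-210)."""
    real_image = "living_room_frame"  # stand-in for the camera frame captured in step 203
    ar_image = f"AR({scan_data['scan_image']} placed at {placement} on {real_image})"
    return ar_image  # displayed locally (step 209) and sent back to the first terminal (step 210)

# Simulated exchange between the two terminals.
inbox = []
first_terminal_scan_and_send(inbox.append)
ar = second_terminal_receive_and_compose(inbox[0], placement=(300, 200))
print(ar)
```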
According to the technical solution provided by this embodiment of the present disclosure, the first terminal sends the scan image of the candidate home furnishing to the second terminal, and the second terminal synthesizes the scan image with the real image of the second user's location according to the to-be-placed position information of the candidate home furnishing at that location, obtaining an augmented reality image that shows the placement effect of the candidate home furnishing at the second user's location. Through cooperation between users located at different places, the placement effect can be seen intuitively without moving the item home, which reduces the probability of returns caused by a poor placement effect, makes shopping more convenient, and improves the shopping experience.
The following are device embodiments of the present disclosure that may be used to perform method embodiments of the present disclosure.
Fig. 3 is a block diagram of an image display apparatus according to an exemplary embodiment. The apparatus may be implemented in various ways, for example by implementing all of its components in a terminal or by coupling its components to the terminal side, and it may implement the methods of the present disclosure described above through software, hardware, or a combination of the two. As shown in fig. 3, the image display apparatus includes a receiving module 301, a shooting module 302, a determining module 303, a synthesizing module 304 and a first display module 305, wherein:
the receiving module 301 is configured to receive scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image;
the shooting module 302 is configured to invoke a camera of a second terminal of a second user to capture a real image of the second user's location;
the determining module 303 is configured to determine to-be-placed position information of the candidate home furnishing at the second user's location;
the synthesizing module 304 is configured to synthesize the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
the first display module 305 is configured to display the augmented reality image on a display screen of the second terminal.
The apparatus provided in this embodiment of the present disclosure can be used to execute the technical solution of the embodiment shown in fig. 1; its manner of execution and beneficial effects are similar and are not repeated here.
In one possible embodiment, as shown in fig. 4, the image display apparatus shown in fig. 3 may further include a sending module 401 configured to send the augmented reality image to the first terminal.
In one possible embodiment, as shown in fig. 5, in the image display apparatus shown in fig. 3 the determining module 303 may include a display sub-module 501, a detection sub-module 502, a first determining sub-module 503, and a second determining sub-module 504, wherein:
the display sub-module 501 is configured to display the real image of the second user's location on the display screen of the second terminal and superimpose the scan image of the candidate home furnishing;
the detection sub-module 502 is configured to detect an augmented reality indication operation of the second user;
the first determining sub-module 503 is configured to determine, according to the augmented reality indication operation, the target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
the second determining sub-module 504 is configured to determine the information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one possible embodiment, the scan data further includes size information; as shown in fig. 6, in the image display apparatus shown in fig. 3 the determining module 303 may include a searching sub-module 601 and a third determining sub-module 602, wherein:
the searching sub-module 601 is configured to search, according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information; and
the third determining sub-module 602 is configured to determine the position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one possible embodiment, the image display apparatus may further include a second display module and a third display module, wherein the second display module is configured to display the real image of the second user's location on the display screen of the second terminal, and the third display module is configured to display a recommendation frame at a position corresponding to the target area on the display screen of the second terminal; the recommendation frame indicates the recommended placement position of the candidate home furnishing at the second user's location.
Fig. 7 is a block diagram of an image display device 700 according to an exemplary embodiment. The image display device 700 may be implemented in various ways, for example by implementing all of its components in a terminal or by coupling its components to the terminal side. The image display device 700 includes:
a processor 701;
a memory 702 for storing processor-executable instructions;
wherein the processor 701 is configured to:
receive scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image;
invoke a camera of a second terminal of a second user to capture a real image of the second user's location;
determine to-be-placed position information of the candidate home furnishing at the second user's location;
synthesize the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
display the augmented reality image on a display screen of the second terminal.
In one embodiment, the processor 701 may be further configured to:
send the augmented reality image to the first terminal.
In one embodiment, the processor 701 may be further configured to:
display the real image of the second user's location on the display screen of the second terminal and superimpose the scan image of the candidate home furnishing;
detect an augmented reality indication operation of the second user;
determine, according to the augmented reality indication operation, the target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
determine the information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the scan data further includes size information; the processor 701 may be further configured to:
search, according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information; and
determine the position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the processor 701 may be further configured to:
display the real image of the second user's location on the display screen of the second terminal; and
display a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, the recommendation frame indicating the recommended placement position of the candidate home furnishing at the second user's location.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be elaborated here.
Fig. 8 is a block diagram of an image display apparatus according to an exemplary embodiment; the image display device 800 is adapted to a terminal; the image display device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the image display device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the image display apparatus 800. Examples of such data include instructions for any application or method operating on image display device 800, contact data, phonebook data, messages, pictures, video, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the image display device 800. The power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the image display device 800.
The multimedia component 808 includes a screen that provides an output interface between the image display device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the image display apparatus 800 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the image display apparatus 800 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the image display device 800. For example, the sensor assembly 814 may detect an on/off state of the image display device 800 and the relative positioning of components, such as the display and keypad of the image display device 800; it may also detect a change in the position of the image display device 800 or one of its components, the presence or absence of user contact with the image display device 800, the orientation or acceleration/deceleration of the image display device 800, and changes in the temperature of the image display device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the image display apparatus 800 and other devices. The image display apparatus 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the image display apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for executing the above method.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including instructions executable by processor 820 of image display device 800 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Fig. 9 is a block diagram of an image display apparatus according to an exemplary embodiment. For example, the image display apparatus 900 may be provided as a server. The image display device 900 includes a processing component 902 that further includes one or more processors, and memory resources, represented by a memory 903, for storing instructions, such as applications, executable by the processing component 902. The application stored in the memory 903 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 902 is configured to execute the instructions to perform the methods described above.
The image display device 900 may also include a power supply component 906 configured to perform power management of the image display device 900, a wired or wireless network interface 905 configured to connect the image display device 900 to a network, and an input/output (I/O) interface 908. The image display device 900 may operate based on an operating system stored in the memory 903, for example Windows Server(TM), Mac OS X(TM), Unix(TM), Linux(TM), FreeBSD(TM), or the like.
A non-transitory computer readable storage medium is provided; when the instructions in the storage medium are executed by a processor of the image display apparatus 800 or 900, the image display apparatus 800 or 900 is enabled to perform an image display method including:
receiving scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data including a scan image;
invoking a camera of a second terminal of a second user to capture a real image of the second user's location;
determining to-be-placed position information of the candidate home furnishing at the second user's location;
synthesizing the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
displaying the augmented reality image on a display screen of the second terminal.
In one embodiment, the method further comprises:
sending the augmented reality image to the first terminal.
In one embodiment, determining the to-be-placed position information of the candidate home furnishing at the second user's location includes:
displaying the real image of the second user's location on the display screen of the second terminal and superimposing the scan image of the candidate home furnishing;
detecting an augmented reality indication operation of the second user;
determining, according to the augmented reality indication operation, the target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
determining the information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the scan data further includes size information, and determining the to-be-placed position information of the candidate home furnishing at the second user's location includes:
searching, according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information; and
determining the position information of the target area as the to-be-placed position information of the candidate home furnishing at the second user's location.
In one embodiment, the method further comprises:
displaying the real image of the second user's location on the display screen of the second terminal; and
displaying a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, the recommendation frame indicating the recommended placement position of the candidate home furnishing at the second user's location.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image display method, comprising:
receiving scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data comprising a scan image;
invoking a camera of a second terminal of a second user to capture a real image of the second user's location, wherein the first user and the second user are located at different places;
determining to-be-placed position information of the candidate home furnishing at the second user's location, wherein the scan data further comprises size information, and determining the to-be-placed position information of the candidate home furnishing at the second user's location comprises: searching, by the second terminal according to the size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information, and determining position information of the target area as the to-be-placed position information;
synthesizing the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
displaying the augmented reality image on a display screen of the second terminal.
2. The method according to claim 1, wherein the method further comprises:
sending the augmented reality image to the first terminal.
3. The method of claim 1, wherein determining the to-be-placed position information of the candidate home furnishing at the second user's location comprises:
displaying the real image of the second user's location on the display screen of the second terminal and superimposing the scan image of the candidate home furnishing;
detecting an augmented reality indication operation of the second user;
determining, according to the augmented reality indication operation, a target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
determining information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
4. The method according to claim 1, wherein the method further comprises:
displaying the real image of the second user's location on the display screen of the second terminal; and
displaying a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, wherein the recommendation frame is used for indicating a recommended placement position of the candidate home furnishing at the second user's location.
5. An image display device, comprising:
a receiving module configured to receive scan data of a candidate home furnishing sent by a first terminal of a first user, the scan data comprising a scan image;
a shooting module configured to invoke a camera of a second terminal of a second user to capture a real image of the second user's location, wherein the first user and the second user are located at different places;
a determining module configured to determine to-be-placed position information of the candidate home furnishing at the second user's location, wherein determining the to-be-placed position information of the candidate home furnishing at the second user's location comprises: searching, by the second terminal according to size information of the candidate home furnishing, the real image of the second user's location for a target area matching the size information, and determining position information of the target area as the to-be-placed position information;
a synthesizing module configured to synthesize the scan image of the candidate home furnishing with the real image of the second user's location according to the to-be-placed position information, to obtain an augmented reality image; and
a first display module configured to display the augmented reality image on a display screen of the second terminal.
6. The apparatus of claim 5, wherein the apparatus further comprises:
a sending module configured to send the augmented reality image to the first terminal.
7. The apparatus of claim 5, wherein the determining module comprises:
a display sub-module configured to display the real image of the second user's location on the display screen of the second terminal and superimpose the scan image of the candidate home furnishing;
a detection sub-module configured to detect an augmented reality indication operation of the second user;
a first determining sub-module configured to determine, according to the augmented reality indication operation, a target position of the scan image of the candidate home furnishing relative to the real image on the display screen of the second terminal; and
a second determining sub-module configured to determine information of the target position as the to-be-placed position information of the candidate home furnishing at the second user's location.
8. The apparatus of claim 5, wherein the apparatus further comprises:
a second display module configured to display the real image of the second user's location on the display screen of the second terminal; and
a third display module configured to display a recommendation frame at a position corresponding to the target area on the display screen of the second terminal, wherein the recommendation frame is used for indicating a recommended placement position of the candidate home furnishing at the second user's location.
9. An image display device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1-4.
10. A computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the steps of the method of any of claims 1-4.
CN201810395072.9A 2018-04-27 2018-04-27 Image display method and device Active CN108596971B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810395072.9A CN108596971B (en) 2018-04-27 2018-04-27 Image display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810395072.9A CN108596971B (en) 2018-04-27 2018-04-27 Image display method and device

Publications (2)

Publication Number Publication Date
CN108596971A CN108596971A (en) 2018-09-28
CN108596971B true CN108596971B (en) 2024-03-19

Family

ID=63610341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810395072.9A Active CN108596971B (en) 2018-04-27 2018-04-27 Image display method and device

Country Status (1)

Country Link
CN (1) CN108596971B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103105993A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
JP2014056486A (en) * 2012-09-13 2014-03-27 Hitachi Ltd Image network system, image display terminal, and image processing method
CN104205014A (en) * 2012-03-26 2014-12-10 索尼公司 Information processing apparatus, information processing method, and program
CN105278663A (en) * 2014-07-18 2016-01-27 南京专创知识产权服务有限公司 Augmented reality house experience system
CN106530404A (en) * 2016-11-09 2017-03-22 大连文森特软件科技有限公司 Inspection system of house for sale based on AR virtual reality technology and cloud storage
CN106991723A (en) * 2015-10-12 2017-07-28 莲嚮科技有限公司 Interactive house browsing method and system of three-dimensional virtual reality
CN107123013A (en) * 2017-03-01 2017-09-01 阿里巴巴集团控股有限公司 Exchange method and device under line based on augmented reality
CN107239997A (en) * 2017-06-01 2017-10-10 景德镇陶瓷大学 Self-service furniture house ornamentation design system
CN107330980A (en) * 2017-07-06 2017-11-07 重庆邮电大学 A kind of virtual furnishings arrangement system based on no marks thing
CN107656610A (en) * 2017-08-08 2018-02-02 触景无限科技(北京)有限公司 The control method and system of augmented reality
CN107909654A (en) * 2017-12-08 2018-04-13 快创科技(大连)有限公司 Home Fashion & Design Shanghai experiencing system based on AR technologies

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342610B2 (en) * 2011-08-25 2016-05-17 Microsoft Technology Licensing, Llc Portals: registered objects as virtualized, personalized displays

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104205014A (en) * 2012-03-26 2014-12-10 索尼公司 Information processing apparatus, information processing method, and program
JP2014056486A (en) * 2012-09-13 2014-03-27 Hitachi Ltd Image network system, image display terminal, and image processing method
CN103105993A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
CN105278663A (en) * 2014-07-18 2016-01-27 南京专创知识产权服务有限公司 Augmented reality house experience system
CN106991723A (en) * 2015-10-12 2017-07-28 莲嚮科技有限公司 Interactive house browsing method and system of three-dimensional virtual reality
CN106530404A (en) * 2016-11-09 2017-03-22 大连文森特软件科技有限公司 Inspection system of house for sale based on AR virtual reality technology and cloud storage
CN107123013A (en) * 2017-03-01 2017-09-01 阿里巴巴集团控股有限公司 Exchange method and device under line based on augmented reality
CN107239997A (en) * 2017-06-01 2017-10-10 景德镇陶瓷大学 Self-service furniture house ornamentation design system
CN107330980A (en) * 2017-07-06 2017-11-07 重庆邮电大学 A kind of virtual furnishings arrangement system based on no marks thing
CN107656610A (en) * 2017-08-08 2018-02-02 触景无限科技(北京)有限公司 The control method and system of augmented reality
CN107909654A (en) * 2017-12-08 2018-04-13 快创科技(大连)有限公司 Home Fashion & Design Shanghai experiencing system based on AR technologies

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
AR interior designer: Automatic furniture arrangement using spatial and functional relationships; Jeff K.T. Tang et al.; 2014 International Conference on Virtual Systems & Multimedia (VSMM); 2014-10-09; pp. 1-8 *
Exploration of the Application of Virtual Reality and Augmented Reality Technology in E-commerce (虚拟现实和增强现实技术在电子商务中的应用探究); Wang Mingyu (汪明宇); Digital Communication World (《数字通信世界》); 2017-12-31; p. 183 *

Also Published As

Publication number Publication date
CN108596971A (en) 2018-09-28

Similar Documents

Publication Publication Date Title
US9667774B2 (en) Methods and devices for sending virtual information card
CN109600303B (en) Content sharing method and device and storage medium
CN112114765A (en) Screen projection method and device and storage medium
EP3147802B1 (en) Method and apparatus for processing information
EP3223147A2 (en) Method for accessing virtual desktop and mobile terminal
US20180144546A1 (en) Method, device and terminal for processing live shows
EP3048508A1 (en) Methods, apparatuses and devices for transmitting data
CN107885016B (en) Holographic projection method and device
CN106792500B (en) Information output method and device and wearable device
CN110636318A (en) Message display method, message display device, client device, server and storage medium
US20170075671A1 (en) Method and apparatus for installing application and smart device using the same
CN111541922B (en) Method, device and storage medium for displaying interface input information
CN112363786A (en) Advertisement picture processing method and device, electronic equipment and storage medium
CN106506808B (en) Method and device for prompting communication message
US20170041377A1 (en) File transmission method and apparatus, and storage medium
CN109255839B (en) Scene adjustment method and device
US10516849B2 (en) Video call method, apparatus and system
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN106845980B (en) Mobile payment method and device in virtual reality environment
CN108596971B (en) Image display method and device
CN111538543B (en) Lost article searching method, lost article searching device and storage medium
CN109391944B (en) Wireless network remarking method and device
US20170154318A1 (en) Information processing method, apparatus, and storage medium
CN113726905A (en) Data acquisition method, device and equipment based on home terminal equipment
CN109389547B (en) Image display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant