CN111949343A - Interface display method and device and electronic equipment - Google Patents


Info

Publication number
CN111949343A
CN111949343A (application number CN201910408195.6A)
Authority
CN
China
Prior art keywords
image
target
images
user interface
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910408195.6A
Other languages
Chinese (zh)
Inventor
颜深根
张祐纶
陈友将
李慧婷
叶安华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sensetime Intelligent Technology Co Ltd
Original Assignee
Shanghai Sensetime Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sensetime Intelligent Technology Co Ltd filed Critical Shanghai Sensetime Intelligent Technology Co Ltd
Priority to CN201910408195.6A priority Critical patent/CN111949343A/en
Publication of CN111949343A publication Critical patent/CN111949343A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00: Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20: Indexing scheme for editing of 3D models
    • G06T2219/2016: Rotation, translation, scaling

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an interface display method and device. The method includes the following steps: displaying a first user interface, where the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images; and controlling information display of an object represented by a target image in the first user interface based on a received first user operation for a target region among the plurality of regions of the first image, where the target image is included in the plurality of second images and corresponds to the target region.

Description

Interface display method and device and electronic equipment
Technical Field
The present disclosure relates to the field of interface display, and in particular, to an interface display method and apparatus, and an electronic device.
Background
With the development of science and technology and the growing number of similar products, how to better display and introduce products to customers has become a focus of merchants' attention. However, the display effect and user experience of current common product display interfaces are poor.
Disclosure of Invention
The embodiment of the disclosure provides an interface display method and device and electronic equipment.
The interface display method provided by the embodiment of the disclosure includes the following steps: displaying a first user interface, where the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images; and controlling information display of an object represented by a target image in the first user interface based on a received first user operation for a target region among the plurality of regions of the first image, where the target image is included in the plurality of second images and corresponds to the target region.
In one or more embodiments, the controlling the information display of the object represented by the target image in the first user interface based on the received first user operation for the target area in the plurality of areas of the first image includes: displaying a target image representing the object at a target location of the first user interface based on a received first user operation for a target region of the plurality of regions of the first image.
In one or more embodiments, the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
In one or more embodiments, the plurality of second images are arranged along a closed curve; the displaying a target image representing the object at a target location of the first user interface includes: controlling the plurality of second images to rotate along the closed curve until the target image reaches the target position.
In one or more embodiments, before receiving a first user operation for a target region of a plurality of regions of the first image, the plurality of second images are rotated along a closed curve in a first direction, the controlling the plurality of second images to be rotated along the closed curve until the target image reaches the target position includes: controlling the plurality of second images to continue to rotate along the closed curve in the first direction until the target image reaches the target position.
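The continue-in-the-same-direction behavior described above can be sketched as follows (a minimal Python illustration; the function and parameter names are hypothetical, not from the disclosure): given evenly spaced images on the closed curve, the carousel keeps turning in the first direction until the target image reaches the target position.

```python
def rotation_to_target(current_index, target_index, n_images, direction=1):
    """Angle in degrees to keep rotating in `direction` (+1 = the first
    direction, -1 = the opposite direction) so that the image currently at
    slot `current_index` lands on slot `target_index`."""
    step = 360.0 / n_images  # angular spacing between adjacent images
    if direction == 1:
        # slots still to travel when continuing in the first direction
        slots = (target_index - current_index) % n_images
    else:
        slots = (current_index - target_index) % n_images
    # a zero result means the target image is already at the target position
    return slots * step
```

Because the modulo keeps the offset non-negative, the carousel never reverses: it always continues along the closed curve in its current direction, as the embodiment describes.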
In one or more embodiments, the controlling the information display of the object represented by the target image in the first user interface based on the received first user operation for the target area in the plurality of areas of the first image includes: displaying, in the first user interface, an introduction page of an object represented by a target image based on a received first user operation for the target region of the plurality of regions of the first image.
In one or more embodiments, the displaying, in the first user interface, an introduction page of the object represented by the target image includes: displaying the introduction page in the first user interface in a layer-overlapping manner, where the introduction page is located in a second layer, and the second layer is overlaid on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency to make the target image visible.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
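The effect of a translucent or preset-transparency background can be modeled with standard alpha compositing (a sketch; the disclosure does not prescribe any particular blending formula, and the function name is illustrative):

```python
def composite(fg, bg, alpha):
    """Blend a foreground pixel (the introduction-page background on the
    second layer) over a background pixel (the first layer) with a preset
    transparency `alpha` in [0, 1]. Pixels are (R, G, B) tuples."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))
```

With `alpha` below 1, the first-layer target image beneath the introduction page remains partially visible, which is the behavior the embodiment describes.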
In one or more embodiments, the plurality of second images are arranged around the first image.
In one or more embodiments, the plurality of second images are distributed along a closed curve that has the same central axis as the first image, and the first image is located below the closed curve.
In one or more embodiments, the first image is located at the center of a closed curve of the distribution of the plurality of second images.
In one or more embodiments, the second image includes a name and an icon of an object represented by the second image.
In one or more embodiments, at least two of the plurality of second images have different sizes or display effects to highlight portions of the second images.
In one or more embodiments, each of the plurality of second images includes a hexagonal background image, the base of each hexagon carries a specific pattern, and the specific patterns at the hexagonal bases of different second images differ in color.
In one or more embodiments, the objects represented by the plurality of second images include at least one of: a training system, a storage system, a multi-service gateway, an operation and maintenance management system, a network management system, and an algorithm management system.
In one or more embodiments, the plurality of regions includes at least one of: frontal lobe region, brainstem region, telencephalon region, cerebellum region, occipital lobe region, and medulla oblongata region.
In one or more embodiments, the correspondence between the plurality of regions and the plurality of second images is determined based on the biological functions to which the plurality of regions in the first image respectively correspond and the functions of the objects respectively represented by the plurality of second images.
In one or more implementations, the first user interface further includes a background image, wherein the background image includes a star field map.
In one or more embodiments, before displaying the first user interface, further comprising: displaying a second user interface, wherein the second user interface comprises a second brain diagram, wherein the size of the second brain diagram is larger than the size of the first brain diagram; converting the second user interface to the first user interface.
In one or more embodiments, the converting the second user interface to the first user interface includes: scaling down the second brain diagram to obtain the first brain diagram, and displaying the plurality of second images.
In one or more embodiments, further comprising: in response to detecting a color change triggering event of a target area of the plurality of areas, changing a color of the target area to a corresponding color of the target area, wherein different areas have different corresponding colors.
In one or more embodiments, the color change triggering event of the target region includes at least one of: clicking the target image, turning the target image to a target position or a position close to the target position, and clicking the target area.
The interface display device provided by the embodiment of the disclosure includes: a display unit configured to display a first user interface, where the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images; and an interaction unit configured to receive a first user operation for a target region among the plurality of regions of the first image; the display unit is configured to control information display of an object represented by a target image in the first user interface based on the received first user operation for the target region, where the target image is included in the plurality of second images and corresponds to the target region.
In one or more embodiments, the display unit is configured to display a target image representing the object at a target position of the first user interface based on a received first user operation for a target area of the plurality of areas of the first image.
In one or more embodiments, the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
In one or more embodiments, the plurality of second images are arranged along a closed curve; the display unit is used for controlling the plurality of second images to rotate along the closed curve until the target image reaches the target position.
In one or more embodiments, before the interaction unit receives a first user operation for a target area of the plurality of areas of the first image, the plurality of second images rotate along a closed curve in a first direction; after the first user operation is received, the display unit controls the plurality of second images to continue rotating along the closed curve in the first direction until the target image reaches the target position.
In one or more embodiments, the display unit is configured to display, in the first user interface, an introduction page of an object represented by a target image based on a received first user operation for the target region of the plurality of regions of the first image.
In one or more embodiments, the display unit is configured to display the introduction page in the first user interface in a layer-overlapping manner, where the introduction page is located in a second layer, and the second layer is overlaid on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency to make the target image visible.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
In one or more embodiments, the plurality of second images are arranged around the first image.
In one or more embodiments, the plurality of second images are distributed along a closed curve that has the same central axis as the first image, and the first image is located below the closed curve.
In one or more embodiments, the first image is located at the center of a closed curve of the distribution of the plurality of second images.
In one or more embodiments, the second image includes a name and an icon of an object represented by the second image.
In one or more embodiments, at least two of the plurality of second images have different sizes or display effects to highlight portions of the second images.
In one or more embodiments, each of the plurality of second images includes a hexagonal background image, the base of each hexagon carries a specific pattern, and the specific patterns at the hexagonal bases of different second images differ in color.
In one or more embodiments, the objects represented by the plurality of second images are different modules or components included in the same product.
In one or more embodiments, the objects represented by the plurality of second images include at least one of: a training system, a storage system, a multi-service gateway, an operation and maintenance management system, a network management system, and an algorithm management system.
In one or more embodiments, the plurality of regions includes at least one of: frontal lobe region, brainstem region, telencephalon region, cerebellum region, occipital lobe region, and medulla oblongata region.
In one or more embodiments, the correspondence between the plurality of regions and the plurality of second images is determined based on the biological functions to which the plurality of regions in the first image respectively correspond and the functions of the objects respectively represented by the plurality of second images.
In one or more implementations, the first user interface further includes a background image, wherein the background image includes a star field map.
In one or more embodiments, the display unit is configured to display a second user interface before displaying the first user interface, wherein the second user interface includes a second brain diagram, and wherein a size of the second brain diagram is larger than a size of the first brain diagram; converting the second user interface to the first user interface.
In one or more embodiments, the display unit is configured to scale down the second brain diagram to obtain the first brain diagram, and to display the plurality of second images.
In one or more embodiments, the display unit is configured to change a color of a target area of the plurality of areas to a corresponding color of the target area in response to detecting a color change triggering event of the target area, where different areas have different corresponding colors.
In one or more embodiments, the color change triggering event of the target region includes at least one of: clicking the target image, turning the target image to a target position or a position close to the target position, and clicking the target area.
The display device provided by the embodiment of the disclosure displays a first user interface, where the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the plurality of second images rotate around the first image.
In one or more embodiments, the plurality of second images are arranged along a closed curve along which the plurality of second images are rotated.
In one or more embodiments, a target image of the plurality of second images is rotated to a target position in response to a first user operation with respect to a target region of a plurality of regions of the first image.
In one or more embodiments, the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
In one or more embodiments, in response to a first user operation with respect to a target region of the plurality of regions of the first image, an introduction page of an object represented by the target image is displayed in the first user interface.
In one or more embodiments, the introduction page is displayed in the first user interface in a layer-overlapping manner, where the introduction page is located in a second layer, and the second layer is stacked on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
In one or more embodiments, the plurality of second images are arranged around the first image.
In one or more embodiments, the plurality of second images are distributed along a closed curve that has the same central axis as the first image, and the first image is located below the closed curve.
In one or more embodiments, the first image is located at the center of a closed curve of the distribution of the plurality of second images.
In one or more embodiments, the second image includes a name and an icon of an object represented by the second image.
In one or more embodiments, at least two of the plurality of second images have different sizes or display effects.
In one or more embodiments, each of the plurality of second images includes a hexagonal background image, the base of each hexagon carries a specific pattern, and the specific patterns at the hexagonal bases of different second images differ in color.
In one or more implementations, the first user interface further includes a background image, wherein the background image includes a star field map.
In one or more embodiments, the display device displays a second user interface, wherein the second user interface includes a second brain diagram, wherein a size of the second brain diagram is larger than a size of the first brain diagram.
In one or more embodiments, in response to detecting a color change triggering event of a target area of the plurality of areas, the color of the target area of the plurality of second images changes to a corresponding color of the target area, wherein different areas have different corresponding colors.
The computer program product provided by the embodiment of the disclosure includes computer-executable instructions that, when executed, implement the interface display method described above.
The storage medium provided by the embodiment of the disclosure stores executable instructions that, when executed by a processor, implement the interface display method described above.
The electronic device provided by the embodiment of the disclosure includes a memory and a processor, where the memory stores computer-executable instructions, and the processor implements the interface display method described above when running the computer-executable instructions on the memory.
In the technical solution of the embodiment of the present disclosure, the first user interface of the computing platform includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images. When a first user operation for a target region among the plurality of regions of the first image is received, information display of an object represented by a target image is controlled in the first user interface, where the target image is included in the plurality of second images and corresponds to the target region. The display interface of the computing platform is simple and attractive, provides a user-friendly interaction mode, and improves user experience.
Drawings
FIG. 1 is a schematic flow chart of an interface display method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a first user interface provided by an embodiment of the present disclosure;
FIG. 3-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 3-2 is an introduction page of a training system provided by an embodiment of the present disclosure;
FIG. 4-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 4-2 is an introduction page of a storage system provided by an embodiment of the present disclosure;
FIG. 5-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 5-2 is an introduction page of the multi-service gateway of the first user interface provided by an embodiment of the present disclosure;
FIG. 6-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 6-2 is an introduction page of the operation and maintenance management system provided by an embodiment of the present disclosure;
FIG. 7-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 7-2 is an introduction page of the network management system provided by an embodiment of the present disclosure;
FIG. 8-1 is another schematic view of a first user interface provided by an embodiment of the present disclosure;
FIG. 8-2 is an introduction page of an algorithm management system provided by an embodiment of the present disclosure;
FIG. 8-3 is a user interface change diagram provided by an embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of an example of an interface display apparatus provided in an embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of an example of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
The embodiment of the present disclosure may be applied to terminal devices such as fixed terminals and mobile terminals, for example: mobile phones, tablet computers, game machines, desktop computers, all-in-one machines, vehicle-mounted terminals, and the like. In the disclosed embodiment, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the clothing or accessories of the user. A wearable device is not only a piece of hardware; it also realizes powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include devices that are full-featured, large in size, and able to implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used in cooperation with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical sign monitoring.
The technical solution provided by the embodiment of the present disclosure may be applied to display of product introduction information, and may also be applied to display of other information, which is not limited in the embodiment of the present disclosure.
Fig. 1 is a schematic flow chart of an interface display method according to an embodiment of the present disclosure.
In 101, a first user interface is displayed, wherein the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images.
In an embodiment of the present disclosure, a first user interface is displayed via a display device. The display device may be a common display screen, a kiosk, a projector, a Virtual Reality (VR) device, an Augmented Reality (AR) device, or the like.
In an embodiment of the present disclosure, the first user interface includes a first image and a plurality of second images, and the first image includes a first brain diagram. In some optional embodiments, the first image further includes other images, such as a ring shape or a hexagon shape surrounding the first brain diagram, and the like, which is not limited in this disclosure.
In some alternative embodiments, the objects represented by the second images are different modules or components of the same product, or different products, where a product may be hardware, software, or an integrated device combining hardware and software. In one example, the objects represented by the second images are different modules or components in an artificial intelligence system or device; for example, the objects represented by the second images include one or any combination selected from the following group: a training system, a storage system, a multi-service gateway, an operation and maintenance management system, a network management system, and an algorithm management system. Alternatively, the plurality of second images may represent other types of objects, which is not limited by the embodiment of the present disclosure.
In an embodiment of the present disclosure, the brain diagram includes a plurality of regions, for example, the brain diagram includes regions corresponding to several main portions included in a real brain. In some optional embodiments, the plurality of regions comprised by the first image comprise one or any combination selected from the group consisting of: frontal lobe region, brainstem region, telencephalon region, cerebellum region, occipital lobe region, and medulla oblongata region. Alternatively, the schematic brain diagram may be divided in other manners, which is not limited in the embodiment of the present disclosure.
In the embodiment of the present disclosure, a plurality of regions in the first image have a corresponding relationship with the plurality of second images. In one or more embodiments, the correspondence between the plurality of regions and the plurality of second images is determined based on the biological functions to which the plurality of regions in the first image respectively correspond and the functions of the objects respectively represented by the plurality of second images. Taking Table 1 as an example, Table 1 gives the correspondence between the plurality of regions and the objects represented by the plurality of second images:
Region of the first image    Object represented by the second image
Frontal lobe region          Training system
Brainstem region             Storage system
Telencephalon region         Multi-service gateway
Cerebellar region            Operation and maintenance management system
Occipital lobe region        Network management system
Medulla oblongata region     Algorithm management system
TABLE 1
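The correspondence of Table 1 can be expressed as a simple lookup (a sketch; the identifiers are illustrative, while the region and object names come directly from Table 1):

```python
# Region-to-object correspondence from Table 1
REGION_TO_SYSTEM = {
    "frontal lobe": "training system",
    "brainstem": "storage system",
    "telencephalon": "multi-service gateway",
    "cerebellum": "operation and maintenance management system",
    "occipital lobe": "network management system",
    "medulla oblongata": "algorithm management system",
}

def system_for_region(region):
    """Return the object whose information should be displayed when the
    user operates on `region` of the brain diagram."""
    return REGION_TO_SYSTEM[region.lower()]
```

A first user operation on a target region would then resolve, through this table, to the target image whose information display is controlled in the first user interface.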
In some alternative embodiments, the plurality of second images are arranged around the first image in the first user interface. Several optional examples are given below, however, the technical solution of the embodiments of the present disclosure is not limited to the following optional examples.
Alternative example one: the plurality of second images are distributed on a closed curve; the closed curve on which the plurality of second images are distributed has the same central axis as the first image, and the first image is located below the closed curve.
Optional example two: the plurality of second images are distributed on a closed curve, and the first image is located in the center of the closed curve of the distribution of the plurality of second images.
Optional example three: the plurality of second images are distributed on a plurality of parallel planes. Taking 6 second images as an example, 3 second images can be distributed on the first plane, and the other 3 second images are respectively distributed on the second plane, and the first plane and the second plane are both parallel to the plane where the screen is located.
In the above scheme, the closed curve may be a regular closed curve or an irregular closed curve, wherein the regular closed curve is, for example, a circular line or an elliptical line.
Taking the closed curve as a loop line as an example, 1) the first image is located on a central axis of the loop line distributed by the plurality of second images and located below the loop line. Or, 2) the first image is located at the center of a loop line of the distribution of the plurality of second images.
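Even placement of the second images on a circular loop line can be sketched as follows (illustrative Python; the disclosure does not specify coordinates or an animation model), where `phase` stands for the current rotation of the carousel:

```python
import math

def ring_positions(n, center=(0.0, 0.0), radius=1.0, phase=0.0):
    """Evenly place n second images on a circular loop line around `center`.
    Returns a list of (x, y) positions; advancing `phase` over time rotates
    all images along the loop."""
    cx, cy = center
    return [
        (cx + radius * math.cos(phase + 2 * math.pi * k / n),
         cy + radius * math.sin(phase + 2 * math.pi * k / n))
        for k in range(n)
    ]
```

For the loop-line layout above, the first image would sit on the central axis of (or at the center of) this circle, depending on which of the two variants is used.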
In some optional embodiments, the first user interface further comprises a background image, wherein the background image comprises a star field map.
In some optional embodiments, the second image includes a name and an icon of the object represented by the second image. In one or more embodiments, each of the plurality of second images has a hexagonal background image, the base of each hexagon carries a specific pattern, and the specific patterns at the hexagonal bases of different second images differ in color. The hexagonal background image is, for example, a circuit diagram, and the specific pattern is, for example, a V-shape.
Here, the shape of the second image is not limited to the above-described hexagon, and may be a circle, an ellipse, or the like.
In some alternative embodiments, at least two of the plurality of second images differ in size or display effect, so as to highlight some of the second images.
For example: referring to fig. 2, fig. 2 is a schematic diagram of a first user interface, in which a space star is taken as a background in the first user interface shown in fig. 2. The first user interface comprises a brain schematic diagram and 6 second images, wherein the 6 second images are all hexagons, the bottoms of the hexagons are in V-shaped shapes, the colors of the V-shaped shapes at the bottoms of the 6 hexagons are different, and in addition, the 6 second images respectively contain names and icons of 6 objects. The names of the 6 objects in fig. 2 are: the method comprises the following steps of training the name (A) of a system, storing the name (B) of the system, multi-service gateway name (C), operation and maintenance management system name (D), network management system name (E) and algorithm management system name (F). The 6 second images are distributed in a ring shape and rotate around a ring line.
In some optional embodiments, based on detecting a color change triggering event for a target area among the plurality of areas, the color of the target area may be changed to the corresponding color of that area, where different areas have different corresponding colors. Considering that the plurality of regions in the first image correspond to the plurality of second images, in one or more embodiments the color of the target region is made the same as the color of the V-shape at the hexagonal base of the target image, so as to indicate this correspondence.
In one or more embodiments, the color change triggering event of the target region includes at least one of: clicking the target image, turning the target image to a target position or a position close to the target position, and clicking the target area.
For example, the frontal lobe area corresponds to the training system (called A); when a color change triggering event is detected, the color of the frontal lobe area is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of A, for example, both are dark purple.
For example, the brainstem area corresponds to the storage system (called B); when a color change triggering event is detected, the color of the brainstem area is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of B, for example, both are rose red.
For example, the telencephalon region corresponds to the multi-service gateway system (called C); when a color change triggering event is detected, the color of the telencephalon region is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of C, for example, both are transparent rose.
For example, the cerebellum area corresponds to the operation and maintenance management system (called D); when a color change triggering event is detected, the color of the cerebellum area is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of D, for example, both are transparent green.
For example, the occipital lobe area corresponds to the network management system (called E); when a color change triggering event is detected, the color of the occipital lobe area is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of E, for example, both are blue.
For example, the medulla oblongata area corresponds to the algorithm management system (called F); when a color change triggering event is detected, the color of the medulla oblongata area is controlled to be consistent with the color of the V-shape at the base of the hexagon bearing the name of F, for example, both are yellow.
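The six examples above can be sketched as one event handler over a region-to-color table; the identifiers and color names below are illustrative only, taken from the examples in this disclosure:

```python
# Illustrative colors: the disclosure names dark purple, rose red,
# transparent rose, transparent green, blue and yellow as examples.
REGION_COLOR = {
    "frontal_lobe": "dark_purple",
    "brainstem": "rose_red",
    "telencephalon": "transparent_rose",
    "cerebellum": "transparent_green",
    "occipital_lobe": "blue",
    "medulla_oblongata": "yellow",
}

def on_color_change_event(region: str, region_colors: dict) -> dict:
    """On a color change triggering event, recolor the target region so it
    matches the V-shape at the base of the corresponding hexagonal image."""
    region_colors = dict(region_colors)  # do not mutate the caller's state
    region_colors[region] = REGION_COLOR[region]
    return region_colors
```

Because a single table drives both the region fill and the hexagon's V-shape, the two are consistent by construction.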
In 102, based on a received first user operation for a target area among the plurality of areas of the first image, the information display of the object represented by the target image is controlled in the first user interface, where the target image is included in the plurality of second images and corresponds to the target area.
In some alternative embodiments, the first user action may be, but is not limited to being: single-click operation, double-click operation, touch operation, gesture operation, voice control operation and the like.
In some optional embodiments, controlling the information display of the object represented by the target image in the first user interface may be implemented as follows:
1) displaying a target image representing the object at a target location of the first user interface.
Here, the plurality of second images are arranged along a closed curve, and the plurality of second images may be controlled to rotate along the closed curve until the target image reaches the target position. The closed curve is, for example, a circular line or an elliptical line.
In one or more implementations, the plurality of second images are rotated along a closed curve in a first direction before receiving a first user operation for a target region of a plurality of regions of the first image. After receiving a first user operation for a target region of a plurality of regions of the first image, controlling the plurality of second images to continue to rotate along the closed curve toward the first direction until the target image reaches the target position. Here, the first direction is, for example, a clockwise direction or a counterclockwise direction.
Here, the projection of the target image displayed at the target position on the screen is located directly above the projection of the first image on the screen. The target position lies within the user's optimal visual field so that the target image is easy to view; for example, the target position is the central position of the first user interface.
For example: the plurality of second images rotate along the circular line at a certain speed; when the user clicks area a in the first image, the plurality of second images as a whole rotate rapidly along the circular line, so that the second image a corresponding to area a quickly reaches the central position of the first user interface, where the central position is a position on the circular line.
For example: the plurality of second images rotate along the circular line at a certain speed; when the user clicks the second image a, the plurality of second images as a whole rotate rapidly along the circular line, so that the second image a quickly reaches the central position of the first user interface, where the central position is a position on the circular line.
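The "continue rotating in the first direction until the target image reaches the target position" behavior can be sketched as follows; the angle convention (counterclockwise-positive radians) and function name are assumptions for illustration:

```python
import math

def rotation_needed(current: float, goal: float, direction: str = "ccw") -> float:
    """Keep rotating the ring in its current direction ('ccw' or 'cw')
    until the target image, now at angle `current`, reaches angle `goal`.

    Returns the non-negative angle (radians) still to travel; the whole
    ring of second images is turned by this amount, never reversed.
    """
    if direction == "ccw":
        return (goal - current) % (2 * math.pi)
    return (current - goal) % (2 * math.pi)
```

Because the remainder is taken modulo 2π, the ring never reverses: a target just "behind" the goal in the first direction simply travels almost a full circle, matching the described behavior.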
2) Displaying an introduction page of the object represented by the target image in the first user interface.
Here, the introduction page is displayed in the first user interface in a layer-overlapping manner, where the introduction page is located in a second layer, and the second layer is overlapped on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency to make the target image visible.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is transparently displayed at a position corresponding to the target image in the introduction page.
3) Displaying a target image representing the object at a target location of the first user interface, and displaying an introduction page of the object represented by the target image in the first user interface.
Here, displaying a target image representing the object at a target position of the first user interface may be as described in the foregoing 1).
Here, an introduction page of the object represented by the target image is displayed in the first user interface, and reference may be made to the description of 2) above.
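The semi-transparent second layer described above amounts to alpha-blending the introduction page over the layer containing the target image; a minimal per-pixel sketch (RGB tuples and the helper name are assumptions) is:

```python
def composite(under: tuple, over: tuple, alpha: float) -> tuple:
    """Blend the semi-transparent introduction-page layer (second layer)
    over the layer containing the target image (first layer), so the
    target image remains visible through the page background.

    `alpha` is the introduction page's opacity in [0, 1].
    """
    return tuple(round((1 - alpha) * u + alpha * o) for u, o in zip(under, over))
```

At alpha = 0 the page is invisible and only the target image shows; a "preset transparency" is simply a fixed alpha below 1.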
The following describes the rotation of the target image and the corresponding introduction page with reference to specific application examples.
Application example one: exhibit A introduction page (introduction page corresponding to training system)
As shown in fig. 3-1, clicking on the frontal lobe region in the brain diagram or clicking on the hexagon written with the a name moves the hexagon written with the a name to the center of the first user interface, while the color of the frontal lobe region is consistent with the color of the bottom V-shape of the hexagon written with the a name, e.g., both are dark purple. Then, as shown in fig. 3-2, the introduction page of the training system a is displayed on the first user interface in an overlaid manner with a certain transparency.
Application example two: show B introduction page (introduction page corresponding to storage system)
As shown in fig. 4-1, clicking on the brainstem region in the brain diagram or clicking on the hexagon bearing the B name moves the hexagon bearing the B name to the center of the first user interface, while the color of the brainstem region is consistent with the color of the bottom V-shape of the hexagon bearing the B name, e.g., both are rose red. Then, as shown in fig. 4-2, the introduction page of the B storage system is displayed superimposed on the first user interface with a certain transparency.
Application example three: show C introduction page (namely introduction page corresponding to multi-service gateway system)
As shown in fig. 5-1, clicking on the telencephalon area in the brain diagram or clicking on the hexagon bearing the C name moves the hexagon bearing the C name to the center of the first user interface, while the color of the telencephalon area is consistent with the color of the bottom V-shape of the hexagon bearing the C name, e.g., both are transparent rose. Then, as shown in fig. 5-2, the introduction page of the C multi-service gateway system is displayed superimposed on the first user interface with a certain transparency.
Application example four: d introduction page is shown (corresponding to the fortune dimension management system)
As shown in fig. 6-1, clicking on the cerebellar region in the brain diagram or clicking on the hexagon written with the D name moves the hexagon written with the D name to the center of the first user interface, while the color of the cerebellar region is consistent with the color of the bottom V-shape of the hexagon written with the D name, e.g., both are transparent green. Then, as shown in fig. 6-2, an introduction page of the D operation and maintenance management system is displayed on the first user interface in an overlapping manner according to a certain transparency.
Application example five: e introduction page is shown (namely introduction page corresponding to network management system)
As shown in fig. 7-1, clicking on the occipital region in the brain diagram or clicking on the hexagon written with the E name moves the hexagon written with the E name to the center of the first user interface while the occipital region is in a color that is consistent with the color of the bottom V-shape of the hexagon written with the E name, e.g., both are blue. Then, as shown in fig. 7-2, the introduction page of the E-network management system is displayed on the first user interface in an overlapping manner according to a certain transparency.
Application example six: showing F introduction page (namely introduction page corresponding to algorithm management system)
As shown in fig. 8-1, clicking on the medullary area in the brain diagram or clicking on the hexagon written with the F name moves the hexagon written with the F name to the center of the first user interface, while the medullary area is in a color that coincides with the color of the bottom V-shape of the hexagon written with the F name, e.g., both yellow. Then, as shown in fig. 8-2, the introduction page of the F algorithm management system is displayed superimposed on the first user interface with a certain transparency.
In some alternative embodiments, the training speed of the computing platform is also displayed on the first user interface.
In some optional embodiments, before the first user interface is displayed, a second user interface is displayed, wherein the second user interface includes a second brain diagram whose size is larger than that of the first brain diagram. Then the second user interface is converted into the first user interface. Here, the conversion may occur automatically, or may be triggered by a second user operation; see fig. 8-3.
In one or more embodiments, converting the second user interface into the first user interface includes: reducing the second brain diagram to obtain the first brain diagram, and displaying the plurality of second images. Further, while the second brain diagram is being reduced, its position is moved so that the reduced first brain diagram ends up at a specific position of the screen (such as a position below the center).
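The simultaneous shrink-and-move transition can be sketched as a linear interpolation of scale and position over a normalized animation parameter; the start/end values below (normalized screen coordinates, shrink to half size, settle below center) are illustrative assumptions:

```python
def transition_frame(t: float,
                     start_scale: float = 1.0, end_scale: float = 0.5,
                     start_pos=(0.5, 0.5), end_pos=(0.5, 0.35)):
    """Linearly interpolate the brain diagram's scale and position while the
    second user interface converts into the first: the large second diagram
    shrinks and moves toward a spot below the screen center.

    t runs from 0.0 (second user interface) to 1.0 (first user interface).
    """
    s = start_scale + t * (end_scale - start_scale)
    x = start_pos[0] + t * (end_pos[0] - start_pos[0])
    y = start_pos[1] + t * (end_pos[1] - start_pos[1])
    return s, (x, y)
```

Driving t from 0 to 1 per frame (optionally through an easing function) shrinks and repositions the diagram in one motion, after which the plurality of second images are displayed.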
Fig. 9 is a schematic structural component view of an interface display device provided in an embodiment of the present disclosure, and as shown in fig. 9, the interface display device includes:
a display unit 901, configured to display a first user interface, where the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the first image includes a plurality of regions corresponding to the plurality of second images;
an interaction unit 902, configured to receive a first user operation for a target region in a plurality of regions of the first image;
the display unit 901 is configured to control information display of an object represented by a target image in the first user interface based on a received first user operation on a target area in a plurality of areas of the first image, where the target image included in the plurality of second images corresponds to the target area.
In one or more embodiments, the display unit 901 is configured to display a target image representing the object at a target position of the first user interface based on a received first user operation for a target area of the plurality of areas of the first image.
In one or more embodiments, the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
In one or more embodiments, the plurality of second images are arranged along a closed curve; the display unit 901 is configured to control the plurality of second images to rotate along the closed curve until the target image reaches the target position.
In one or more embodiments, before the interaction unit 902 receives the first user operation for the target area in the plurality of areas of the first image, the plurality of second images rotate along the closed curve in a first direction; after the interaction unit 902 receives the first user operation, the display unit 901 controls the plurality of second images to continue rotating along the closed curve in the first direction until the target image reaches the target position.
In one or more embodiments, the display unit 901 is configured to display, in the first user interface, an introduction page of an object represented by a target image based on a received first user operation on the target area in the plurality of areas of the first image.
In one or more embodiments, the display unit 901 is configured to display the introduction page in the first user interface in a manner of overlaying layers, where the introduction page is located in a second layer, and the second layer is overlaid on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency to make the target image visible.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
In one or more embodiments, the plurality of second images are arranged around the first image.
In one or more embodiments, the plurality of second images are distributed on a closed curve, the closed curve has the same central axis as the first image, and the first image is located below the closed curve.
In one or more embodiments, the first image is located at the center of a closed curve of the distribution of the plurality of second images.
In one or more embodiments, the second image includes a name and an icon of an object represented by the second image.
In one or more embodiments, at least two of the plurality of second images have different sizes or display effects to highlight portions of the second images.
In one or more embodiments, the plurality of second images includes a background image having hexagons, the bases of the hexagons having a specific pattern, and the specific pattern of the hexagonal bases of different second images are different in color.
In one or more embodiments, the objects represented by the plurality of second images include at least one of:
the system comprises a training system, a storage system, a multi-service gateway, an operation and maintenance management system, a network management system and an algorithm management system.
In one or more embodiments, the plurality of regions includes at least one of: frontal lobe region, brainstem region, telencephalon region, cerebellum region, occipital lobe region, and medulla oblongata region.
In one or more embodiments, the correspondence between the plurality of regions and the plurality of second images is determined based on the biological functions to which the plurality of regions in the first image respectively correspond and the functions of the objects respectively represented by the plurality of second images.
In one or more implementations, the first user interface further includes a background image, wherein the background image includes a star field map.
In one or more embodiments, the display unit 901 is configured to display a second user interface before displaying the first user interface, where the second user interface includes a second brain diagram, and a size of the second brain diagram is larger than a size of the first brain diagram; converting the second user interface to the first user interface.
In one or more embodiments, the display unit 901 is configured to perform reduction processing on the second brain diagram to obtain the first brain diagram, and display the plurality of second images.
In one or more embodiments, the display unit 901 is configured to change a color of a target area in the plurality of areas to a corresponding color of the target area in response to detecting a color change triggering event of the target area, where different areas have different corresponding colors.
In one or more embodiments, the color change triggering event of the target region includes at least one of:
clicking the target image, turning the target image to a target position or a position close to the target position, and clicking the target area.
It should be understood by those skilled in the art that the functions of the units in the interface display apparatus shown in fig. 9 can be understood by referring to the related description of the interface display method. The functions of the units in the interface display apparatus shown in fig. 9 may be implemented by a program running on a processor, or may be implemented by specific logic circuits.
The disclosed embodiments also provide a display device, which may be a common display screen, an all-in-one machine, a projector, a VR device, an AR device, or the like. The display device displays a first user interface, wherein the first user interface includes a first image and a plurality of second images, the first image includes a first brain diagram, and the plurality of second images rotate around the first image.
In one or more embodiments, the plurality of second images are arranged along a closed curve along which the plurality of second images are rotated.
In one or more embodiments, a target image of the plurality of second images is rotated to a target position in response to a first user operation with respect to a target region of a plurality of regions of the first image.
In one or more embodiments, the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
In one or more embodiments, in response to a first user operation with respect to a target region of the plurality of regions of the first image, an introduction page of an object represented by the target image is displayed in the first user interface.
In one or more embodiments, the introduction page is displayed in the first user interface in a layer-overlapping manner, where the introduction page is located in a second layer, and the second layer is stacked on a first layer to which the target object belongs.
In one or more embodiments, the background of the introduction page is translucent or has a preset transparency.
In one or more embodiments, in the first user interface displayed by overlapping the first layer and the second layer, the target image is located at a center position of the introduction page.
In one or more embodiments, the plurality of second images are arranged around the first image.
In one or more embodiments, the plurality of second images are distributed on a closed curve, the closed curve has the same central axis as the first image, and the first image is located below the closed curve.
In one or more embodiments, the first image is located at the center of a closed curve of the distribution of the plurality of second images.
In one or more embodiments, the second image includes a name and an icon of an object represented by the second image.
In one or more embodiments, at least two of the plurality of second images have different sizes or display effects.
In one or more embodiments, the plurality of second images includes a background image having hexagons, the bases of the hexagons having a specific pattern, and the specific pattern of the hexagonal bases of different second images are different in color.
In one or more implementations, the first user interface further includes a background image, wherein the background image includes a star field map.
In one or more embodiments, the display device displays a second user interface, wherein the second user interface includes a second brain diagram, wherein a size of the second brain diagram is larger than a size of the first brain diagram.
In one or more embodiments, in response to detecting a color change triggering event of a target area of the plurality of areas, the color of the target area of the plurality of second images changes to a corresponding color of the target area, wherein different areas have different corresponding colors.
The interface display device according to the embodiment of the present invention may also be stored in a computer-readable storage medium if it is implemented in the form of a software function module and sold or used as an independent product. Based on such understanding, the technical solutions of the embodiments of the present invention, in essence or the part contributing to the prior art, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB disk, a removable hard disk, a read only memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
Accordingly, the embodiment of the present invention further provides a computer program product, in which computer-executable instructions are stored, and when the computer-executable instructions are executed, the interface display method of the embodiment of the present invention can be implemented.
Fig. 10 is a schematic structural component diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 10, the electronic device 100 may include one or more processors 1002 (only one of which is shown in the figure; the processors 1002 may include, but are not limited to, a processing device such as a microcontroller unit (MCU) or a field programmable gate array (FPGA)), a memory 1004 for storing data, and a transmission device 1006 for a communication function. It will be understood by those skilled in the art that the structure shown in fig. 10 is merely illustrative and is not intended to limit the structure of the electronic device. For example, electronic device 100 may also include more or fewer components than shown in fig. 10, or have a different configuration than shown in fig. 10.
The memory 1004 can be used for storing software programs and modules of application software, such as program instructions/modules corresponding to the method in the embodiment of the present invention, and the processor 1002 executes various functional applications and data processing by running the software programs and modules stored in the memory 1004, so as to implement the method described above. The memory 1004 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 1004 may further include memory located remotely from the processor 1002, which may be connected to the electronic device 100 through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 1006 is used for receiving or sending data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the electronic device 100. In one example, the transmission device 1006 includes a Network adapter (NIC) that can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 1006 can be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
The technical schemes described in the embodiments of the present invention can be combined arbitrarily without conflict.
In the embodiments provided in the present invention, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one second processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (10)

1. An interface display method, comprising:
displaying a first user interface, wherein the first user interface comprises a first image and a plurality of second images, the first image comprises a first brain diagram, and the first image comprises a plurality of areas corresponding to the plurality of second images;
controlling information display of an object represented by a target image in the first user interface based on a received first user operation for a target area in a plurality of areas of the first image, wherein the target image is included in the plurality of second images and corresponds to the target area.
2. The method of claim 1, wherein controlling display of information in the first user interface for an object represented by a target image based on a received first user operation for a target region of the plurality of regions of the first image comprises:
displaying a target image representing the object at a target location of the first user interface based on a received first user operation for a target region of the plurality of regions of the first image.
3. The method of claim 2, wherein the projection of the target image displayed at the target location on the screen is directly above the projection of the first image on the screen.
4. The method of claim 2 or 3, wherein the plurality of second images are arranged along a closed curve;
the displaying a target image representing the object at a target location of the first user interface includes:
controlling the plurality of second images to rotate along the closed curve until the target image reaches the target position.
5. The method of claim 4, wherein, before the first user operation for the target area among the plurality of areas of the first image is received, the plurality of second images rotate along the closed curve in a first direction; and the controlling the plurality of second images to rotate along the closed curve until the target image reaches the target position comprises:
controlling the plurality of second images to continue rotating along the closed curve in the first direction until the target image reaches the target position.
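The behavior claims 4 and 5 recite (keep rotating in the already-established first direction until the target image arrives, rather than reversing) can be sketched as a small angular computation. This is an illustrative sketch only, assuming the closed curve is parameterized by an angle in degrees; the function name, the sign convention for direction, and the use of degrees are all assumptions, not anything the patent specifies.

```python
def rotation_to_target(current_angle: float, target_angle: float,
                       direction: int) -> float:
    """Angular distance (in degrees, always non-negative) to travel in
    `direction` (+1 = the first direction, -1 = the reverse) so that an
    image currently at `current_angle` on the closed curve reaches
    `target_angle`.

    Because travel is constrained to `direction`, the result can exceed
    180 degrees: continuing in the first direction may be the longer
    way around, but the motion stays visually continuous.
    """
    # Signed offset toward the target, folded into [0, 360) along the
    # permitted direction of travel.
    return (target_angle - current_angle) * direction % 360.0
```

For example, with direction +1, moving from 90 to 30 degrees takes the long way (300 degrees) instead of reversing 60 degrees, which matches claim 5's "continue rotating ... in the first direction".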
6. An interface display apparatus, comprising:
a display unit, configured to display a first user interface, wherein the first user interface comprises a first image and a plurality of second images, the first image comprises a first brain schematic diagram, and the first image comprises a plurality of areas corresponding to the plurality of second images; and
an interaction unit, configured to receive a first user operation for a target area among the plurality of areas of the first image;
wherein the display unit is further configured to control, based on the received first user operation for the target area among the plurality of areas of the first image, information display of an object represented by a target image in the first user interface, the target image being included in the plurality of second images and corresponding to the target area.
7. A display device, wherein a first user interface is displayed on the display device, the first user interface comprises a first image and a plurality of second images, the first image comprises a first brain schematic diagram, and the plurality of second images rotate around the first image.
8. A computer program product, comprising computer-executable instructions which, when executed, implement the method steps of any one of claims 1 to 5.
9. A storage medium having stored thereon executable instructions which, when executed by a processor, implement the method steps of any one of claims 1 to 5.
10. An electronic device, comprising a memory and a processor, wherein the memory stores computer-executable instructions, and the processor, when executing the computer-executable instructions on the memory, is configured to perform the method steps of any one of claims 1 to 5.
CN201910408195.6A 2019-05-15 2019-05-15 Interface display method and device and electronic equipment Pending CN111949343A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910408195.6A CN111949343A (en) 2019-05-15 2019-05-15 Interface display method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910408195.6A CN111949343A (en) 2019-05-15 2019-05-15 Interface display method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111949343A true CN111949343A (en) 2020-11-17

Family

ID=73336001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910408195.6A Pending CN111949343A (en) 2019-05-15 2019-05-15 Interface display method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111949343A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101751906A (en) * 2008-12-17 2010-06-23 三星电子株式会社 Display method and photographing apparatus and display apparatus using the same
US20110302529A1 (en) * 2010-06-08 2011-12-08 Sony Corporation Display control apparatus, display control method, display control program, and recording medium storing the display control program
US20150046875A1 (en) * 2013-08-07 2015-02-12 Ut-Battelle, Llc High-efficacy capturing and modeling of human perceptual similarity opinions
US20170038926A1 (en) * 2008-10-22 2017-02-09 D.R. Systems, Inc. Pressure sensitive manipulation of medical image data
KR101772158B1 (en) * 2016-03-08 2017-08-28 삼성전자주식회사 Device and method thereof for displaying image
CN107801075A (en) * 2016-08-30 2018-03-13 三星电子株式会社 Image display and its operating method
CN108024127A (en) * 2016-10-28 2018-05-11 三星电子株式会社 Image display device, mobile equipment and its operating method
CN108205431A (en) * 2016-12-16 2018-06-26 三星电子株式会社 Show equipment and its control method
CN109196860A (en) * 2016-07-30 2019-01-11 华为技术有限公司 Control method and related apparatus for multi-view images
CN109343782A (en) * 2018-08-02 2019-02-15 维沃移动通信有限公司 Display method and terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Baixin; LIU Chengliang; GONG Liang: "Human-computer interaction interface design based on touch-screen devices", Mechatronics (机电一体化), no. 04, 15 April 2015 (2015-04-15) *

Similar Documents

Publication Publication Date Title
JP6515202B2 (en) Icon display method and device
CN107957831B (en) Data processing method, device and processing equipment for displaying interface content
TW202219704A (en) Dynamic configuration of user interface layouts and inputs for extended reality systems
US20200264694A1 (en) Screen control method and device for virtual reality service
CN107533661A (en) Electronic device including a coil
CN109939440A (en) Generation method and device for 3D game maps, processor, and terminal
KR102593344B1 (en) Electronic device, wearable device and method for controlling a display in the electronic device
CN105900051A (en) Electronic device and method for displaying event in virtual reality mode
CN109771947A (en) Costume changing method and device for game characters, computer storage medium, and electronic device
CN106575160A (en) Method and apparatus for providing interface recognizing movement in accordance with user's view
CN105657638A (en) Method and device for function sharing between electronic devices
CN109117779A (en) Outfit recommendation method, device, and electronic device
EP2775664A1 (en) Resource information display method and apparatus
TWI750561B (en) Electronic devices with display burn-in mitigation
CN107491177A (en) Method for recognizing rotation of a rotating body and electronic device for processing the method
CN111767817B (en) Dress collocation method and device, electronic equipment and storage medium
CN109462628A (en) Customized setting method and system for shared vehicles, cloud server, and shared vehicle
CN110476178A (en) Method and device for providing recommendation information for items
CN105094615B (en) Information processing method and electronic device
CN110120087A (en) The label for labelling method, apparatus and terminal device of three-dimensional sand table
CN104090706B (en) Content acquisition method, content sharing method, and devices therefor
Qureshi et al. Fully integrated data communication framework by using visualization augmented reality for internet of things networks
CN108363574A (en) SDK-based front-end customization method, device, terminal device, and storage medium
CN111949343A (en) Interface display method and device and electronic equipment
CN111708476B (en) Virtual keyboard display method, virtual keyboard and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination