CN112308757A - Data display method and mobile terminal - Google Patents

Data display method and mobile terminal

Info

Publication number: CN112308757A
Application number: CN202011116845.9A
Authority: CN (China)
Prior art keywords: image, processing unit, window, data, target navigation
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112308757B
Inventors: 张凯, 罗伦文, 谭军胜
Current Assignee: Wuhan Zhongke Tongda High New Technology Co Ltd
Original Assignee: Wuhan Zhongke Tongda High New Technology Co Ltd
Application filed by Wuhan Zhongke Tongda High New Technology Co Ltd
Priority claimed from application CN202011116845.9A
Events: publication of CN112308757A; application granted; publication of CN112308757B; anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 1/20: Processor architectures; processor configuration, e.g. pipelining
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/80: Geometric correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the application provide a data display method and a mobile terminal, relating to the technical field of smart cities. The method comprises the following steps: a central processing unit (CPU) determines the size and position of a first window and a second window in a data display interface according to historical operation data; the CPU copies image data to obtain first image data and second image data and transmits them to a first processing unit and a second processing unit respectively; the first processing unit renders the first image data into a first image under a large viewing angle and displays the first image in the first window; the second processing unit generates a second image under a small viewing angle according to a second projection matrix and the second image data and displays the second image in the second window; and a third processing unit determines and processes a target navigation area of the second image in a first image model corresponding to the first image according to the second projection matrix, so as to obtain and highlight a target navigation image. The embodiments of the application improve the user's operation efficiency, improve the efficiency of understanding the image data, and reduce the power consumption of the mobile terminal.

Description

Data display method and mobile terminal
Technical Field
The application relates to the technical field of smart cities, in particular to a data display method and a mobile terminal.
Background
Traditional video surveillance mainly displays 2D planar images, but with the rise of computer technology the advantages of fisheye images in the surveillance industry have become increasingly obvious. A traditional planar camera can only monitor the scene at a single position, whereas a fisheye camera, thanks to its much wider viewing angle, can monitor a far larger field of view; a site that originally required several planar cameras can therefore be covered by a single fisheye camera, which greatly reduces hardware cost.
Because the fisheye camera has such a wide viewing angle, the fisheye image (image data) obtained by shooting usually has severe distortion and is normally displayed as a circle. As a result, the fisheye image is difficult to understand for anyone but professional technicians, which prevents the application of fisheye images from being widely popularized and developed.
Disclosure of Invention
The embodiment of the application provides a data display method and a mobile terminal, which can improve the understanding efficiency of image data acquired by a fisheye camera, reduce the power consumption of the mobile terminal and improve the user experience.
The embodiments of the application provide a data display method suitable for a mobile terminal, where the mobile terminal comprises a central processing unit, a memory, and a first processing unit, a second processing unit and a third processing unit that run on a graphics processor; the data display method comprises the following steps:
the central processing unit determines the size and the position of a first window and a second window in the data display interface according to historical operation data;
the central processing unit copies image data to obtain first image data and second image data, and transmits the first image data to the first processing unit and transmits the second image data to the second processing unit;
the first processing unit renders the first image data into a first image under a large viewing angle, and displays the first image in a first window of the data display interface;
the central processing unit determines a second projection matrix according to control operation of a user on a second window of the data display interface, and transmits the second projection matrix to the second processing unit and the third processing unit;
the second processing unit generates a second image under a small viewing angle according to the second projection matrix and the second image data, and displays the second image in the second window;
the third processing unit determines a target navigation image according to the second projection matrix so as to display the target navigation image in a highlighted manner in the first window, where the target navigation image represents the position information of the second image under the small viewing angle within the first image under the large viewing angle.
An embodiment of the present application further provides a mobile terminal, where the mobile terminal includes: one or more central processors; a memory; one or more graphics processors, and one or more computer programs, wherein the central processor is connected to the memory and the graphics processors, the one or more computer programs being stored in the memory and configured to be executed by the central processor and the graphics processors to perform the data presentation method.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the data presentation method are implemented.
In the embodiments of the application, the image data are processed and displayed in the first window and the second window of the data display interface by the central processing unit of the mobile terminal together with the processing units running on the graphics processor. Specifically, the central processing unit determines the size and position of the first window and the second window in the data display interface according to historical operation data; because the determined sizes and positions take the user's historical operation data into account, they conform to the user's operation habits, which improves the user's operation efficiency. The central processing unit copies the image data to obtain first image data and second image data, transmits the first image data to the first processing unit and the second image data to the second processing unit. The first processing unit renders the first image data into a first image under a large viewing angle and displays it in the first window. The central processing unit determines a second projection matrix according to the user's control operation on the second window and transmits it to the second processing unit and the third processing unit; the second processing unit generates a second image under a small viewing angle according to the second projection matrix and the second image data and displays it in the second window. Processing the image data on the graphics processor in this way reduces the power consumption of the mobile terminal and improves the efficiency of processing the image data, and the images obtained under different viewing angles improve the efficiency of understanding the image data content. Finally, the third processing unit determines the target navigation image according to the second projection matrix so as to display it in a highlighted manner in the first window; on the one hand this processing is again done on the graphics processor, which improves processing efficiency and reduces power consumption, and on the other hand the target navigation image lets the user see which part of the first image the second image corresponds to.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1a is a schematic view of a data presentation system provided in an embodiment of the present application;
fig. 1b is a schematic structural diagram of a mobile terminal provided in an embodiment of the present application;
FIG. 2 is a schematic flow chart diagram illustrating a data presentation method according to an embodiment of the present disclosure;
FIG. 3 is a sub-flow diagram of an image data presentation method according to an embodiment of the present application;
FIGS. 4 a-4 b are schematic diagrams of an initial interface provided by an embodiment of the present application;
FIG. 5a is a schematic diagram illustrating an interface operation effect of an initial interface provided in an embodiment of the present application;
FIG. 5b is a schematic diagram illustrating a partitioning effect of an initial interface according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an initial interface after filling, provided by an embodiment of the present application;
fig. 7 is a schematic diagram of image data acquired by a fisheye camera provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of an imaging principle of perspective projection provided by an embodiment of the present application;
FIG. 9 is a schematic view of a large-view scene provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a data presentation interface presentation image provided by an embodiment of the present application;
fig. 11 is a scene schematic diagram of a small viewing angle provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a data display method, a mobile terminal and a storage medium. The mobile terminal includes, but is not limited to, a smart phone, a tablet computer, a notebook computer, a smart robot, a wearable device, a vehicle-mounted terminal, and the like.
Please refer to fig. 1a, which is a schematic view illustrating a data display system according to an embodiment of the present disclosure. The data display system comprises a fisheye camera and a mobile terminal. The number of the fisheye cameras can be one or more, the number of the mobile terminals can also be one or more, and the fisheye cameras and the mobile terminals can be directly connected or can be connected through a network. The fisheye camera in the embodiment of fig. 1a is connected to the mobile terminal through a network, where the network includes network entities such as a router and a gateway.
The fisheye camera shoots to obtain initial image data, i.e. the images captured by the fisheye camera, and sends the captured initial image data to the mobile terminal; the mobile terminal receives the initial image data shot by the fisheye camera and stores them in the memory. In one case the initial image data are used directly as the image data collected by the fisheye camera and stored in the memory; in the other case the initial image data are first corrected to obtain the image data collected by the fisheye camera, which are then stored in the memory. Finally, the image data are processed accordingly by the central processing unit and the graphics processor and displayed. The purpose of the correction process is to reduce or eliminate the distortion in the initial image data.
Specifically, the mobile terminal includes a processor 101, which is the control center of the mobile terminal. The processor 101 includes one or more Central Processing Units (CPUs) and at least one Graphics Processing Unit (GPU); the at least one graphics processor is connected to the central processor. The graphics processor comprises a first processing unit, a second processing unit and a third processing unit. The mobile terminal also includes a memory 102 of one or more computer-readable storage media, and the memory 102 is connected to the central processor. It should be noted that, in the embodiments of the application, the first processing unit, the second processing unit and the third processing unit may be three identical or different module units running on one graphics processor, or different module units running on at least two graphics processors. If they are different module units running on at least two graphics processors (for example, the first processing unit runs on one graphics processor, the second on another, the third on yet another), it can be understood that the hardware of the mobile terminal is improved: at least two graphics processors are provided (in the prior art a mobile terminal has either no graphics processor or one graphics processor), and the at least two graphics processors can execute in parallel, which greatly improves the efficiency of data processing.
The central processing unit connects the various parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs (computer programs) and/or modules stored in the memory 102 and by calling data stored in the memory 102, such as image data, thereby monitoring the mobile terminal as a whole. Optionally, the central processor may include one or more processing cores; preferably, the central processor may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the central processor. The graphics processor is mainly used to accelerate the processing of the data transmitted by the central processing unit, such as rendering.
The memory 102 may be used to store software programs (computer programs) and modules, and the processor 101 executes various functional applications and data processing by running the software programs and modules stored in the memory 102. The memory 102 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile terminal, image data collected by the fisheye camera, and the like. Further, the memory 102 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 102 may also include a memory controller to provide the processor 101 access to the memory 102.
As shown in fig. 1b, the mobile terminal may further include, in addition to the processor 101 and the memory 102: a Radio Frequency (RF) circuit 103, a power supply 104, an input unit 105, and a display unit 106. Those skilled in the art will appreciate that the mobile terminal architecture shown in the figures is not intended to be limiting of mobile terminals and may include more or fewer components than those shown, or some of the components may be combined, or a different arrangement of components. Wherein:
the RF circuit 103 may be used for receiving and transmitting signals during information transmission and reception, and in particular, for receiving downlink information of a base station and then processing the received downlink information by one or more processors 101; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuitry 103 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 103 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The mobile terminal further includes a power supply 104 (e.g., a battery) for supplying power to the various components, and preferably, the power supply 104 is logically connected to the processor 101 via a power management system, so that functions of managing charging, discharging, and power consumption are implemented via the power management system. The power supply 104 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
The mobile terminal may further include an input unit 105, and the input unit 105 may be used to receive input numeric or character information and generate a keyboard, mouse, joystick, optical or trackball signal input in relation to user settings and function control. Specifically, in one particular embodiment, the input unit 105 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 101, and can receive and execute commands sent by the processor 101. In addition, touch sensitive surfaces may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 105 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The mobile terminal may also include a display unit 106, and the display unit 106 may be used to display information input by the user or provided to the user, as well as various graphical user interfaces of the mobile terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 106 may include a Display panel, and optionally, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may cover the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 101 to determine the type of the touch event, and then the processor 101 provides a corresponding visual output on the display panel according to the type of the touch event. Although in the figures the touch sensitive surface and the display panel are shown as two separate components to implement input and output functions, in some embodiments the touch sensitive surface may be integrated with the display panel to implement input and output functions.
Although not shown, the mobile terminal may further include a camera (note that the camera here is different from a virtual camera described below, and the camera here refers to hardware), a bluetooth module, and the like, which are not described herein again. Specifically, in this embodiment, the processor 101 in the mobile terminal loads the executable file corresponding to the process of one or more computer programs into the memory 102 according to the corresponding instructions, and the processor 101 runs the computer program stored in the memory 102, thereby implementing the steps in any data presentation method described below. Therefore, the beneficial effects that can be achieved by any data presentation method described below can also be achieved, and specific reference is made to the corresponding description of the data presentation method below.
The first image and the second image in the embodiment of the present application may be images at different viewing angles obtained by processing image data acquired by a common camera (various plane cameras, etc.), or may be images at different viewing angles obtained by processing image data acquired by a fisheye camera. Because the image data acquired by the fisheye camera is not easy to understand, the embodiment of the application will be described by taking the processing of the image data acquired by the fisheye camera as an example; the processing of the image data collected by the common camera is consistent and will not be described again.
Fig. 2 is a schematic flow chart of a data display method according to an embodiment of the present application. The data display method is operated in the mobile terminal, and comprises the following steps:
201, the central processing unit determines the size and position of a first window and a second window in the data display interface according to historical operation data.
The historical operation data may be the user's historical operations on the data display interface itself, or the user's historical operations on an initial interface corresponding to the data display interface. To better capture the user's operation habits and to reduce the influence that controls already present on the data display interface would have on the result, the historical operation data collected on the initial interface corresponding to the data display interface are used as the standard, and the embodiments of the application are described using this case as an example.
The size and position of the first window and the second window in the data display interface can then be determined according to the historical operation data. Understandably, because the determined sizes and positions take the user's historical operation data into account, they conform to the user's operation habits, and the efficiency of the user's operations on the data display interface is improved.
In one embodiment, the step of determining the size and position of the first window and the second window in the data presentation interface based on historical operating data includes steps 301-305, as shown in FIG. 3.
301, generating an initial interface corresponding to the data display interface.
The initial interface is the same size interface as the data presentation interface.
Specifically, the step of generating an initial interface corresponding to the data display interface includes: acquiring the display size of a data display interface; determining the display size of the initial interface according to the display size of the data display interface; and generating an initial interface of the data display interface according to the display size of the initial interface.
In some cases the data display interface has an invalid operation area. In this case, the step of generating an initial interface corresponding to the data display interface specifically includes: acquiring the size and position of the effective operation area in the data display interface and the display size of the data display interface; determining the display size of the initial interface according to the display size of the data display interface; generating the initial interface of the data display interface according to the display size of the initial interface; and determining, according to the size and position of the effective operation area, the size and position of the effective acquisition area used to collect historical operation data in the initial interface. The invalid operation area refers to an area of the data display interface that does not respond to an operation even if one is performed there.
Wherein, the generated initial interface may be a blank interface, as shown in fig. 4 a; or the generated initial interface is an interface including the same operation unit areas uniformly distributed, as shown in fig. 4 b. Here, the unit area refers to a minimum area that can be operated in the initial interface (UI).
302, receiving historical operation data of the current user through the initial interface.
The user can operate on the initial interface, and historical operation data of the current user is received through the initial interface. The historical operation data may be operation data within a preset time.
303, dividing the initial interface into at least two display areas with different operation heat degrees according to the historical operation data triggered in each unit area of the initial interface.
Each unit area in the initial interface may be triggered by a user's finger or by an input device having a pointer, such as a mouse or a laser pointer. The operation of triggering each unit area may be at least one of a click operation, a double click operation, and a long press operation. After the user operates each unit area in the initial interface, the mobile terminal can integrate and analyze all operation data to obtain historical operation data. For example, an initial interface displayed on a screen of the mobile terminal is a blank interface, after a user performs a click operation in any area of the blank interface by using a mouse, the mobile terminal may analyze, based on the click operation, to obtain corresponding operation data, such as an operation position, and if the user performs the operation on the blank interface continuously or intermittently within a preset time, the mobile terminal may obtain historical operation data, such as data including operation times, operation position, and the like, triggered by the initial interface within the preset time.
After the historical operation data are obtained, the initial interface is divided into at least two display areas with different operation heat degrees. A display area is a relatively independent sub-area of the display interface; the operation heat degree indicates how heavily a display area is operated, and the operation heat degrees of different display areas may be calculated from the number of operations per unit time or from the ratio of operation counts, for example 20% or 80%.
The display areas are divided according to the operation positions in the historical operation data and the number of display windows that need to be displayed independently, and the operation heat degree of each display area is determined according to the number of operations in the historical operation data. As shown in FIG. 5a, each operation position triggered on the initial interface is represented as a dot; the position can be converted into coordinate information on the initial interface, so each dot corresponds to the operation coordinates of a historical operation. Assuming that three display windows need to be displayed, the initial interface is divided into 3 display areas, whose positions and sizes are determined by the operation positions in the historical operation data. The number of times the initial interface was triggered is embodied as the total number of dots. As shown in FIG. 5b, the initial interface is divided into 3 independently displayed display areas, where area (1) contains 7 triggered dots, area (2) contains 3 triggered dots, and area (3) contains 1 triggered dot, so the operation heat degrees satisfy: area (1) > area (2) > area (3).
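As a purely illustrative sketch of the heat calculation in step 303 (not taken from the patent), the following C++ fragment counts how many historical operation points fall into each display area and ranks the areas by operation heat. It assumes the display areas have already been laid out as axis-aligned rectangles and that every historical operation has been reduced to a screen coordinate; the structure and function names are assumptions.

```cpp
#include <algorithm>
#include <vector>

struct OperationPoint { float x, y; };   // one historical operation on the initial interface

struct DisplayArea {
    float left, top, width, height;      // rectangle of the area on the initial interface
    int hits = 0;                        // operation heat measured as a raw count

    bool contains(const OperationPoint& p) const {
        return p.x >= left && p.x <= left + width &&
               p.y >= top  && p.y <= top + height;
    }
};

// Count how many historical operation points fall inside each display area and sort
// the areas so that index 0 holds the hottest one (area (1) in the example of FIG. 5b).
void rankByOperationHeat(std::vector<DisplayArea>& areas,
                         const std::vector<OperationPoint>& history) {
    for (auto& area : areas)
        for (const auto& p : history)
            if (area.contains(p)) ++area.hits;

    std::sort(areas.begin(), areas.end(),
              [](const DisplayArea& a, const DisplayArea& b) { return a.hits > b.hits; });
}
```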
304, marking each display window as a display window with a different operation demand degree according to the historical operation frequency data of each display window in the image data display function, where the display windows include a first window and a second window.
The historical operating frequency data refers to the frequency of operating each display window within a preset time period, for example, the number of uses within the past 24 hours is 5. The operation requirement degree refers to the degree to which each presentation window needs to be operated, and the operation requirement degrees of different presentation windows may be calculated according to the number of operations operated in unit time, or may be calculated according to the ratio of the number of operations, for example, 20%, 80%, or the like. The display window comprises a first window and a second window.
The mobile terminal obtains the historical operation frequency data of each display window in image data display and marks each display window with an operation demand degree according to those data: the larger the value in the historical operation frequency data, the higher the operation demand degree; the smaller the value, the lower the operation demand degree. Once each display window carries its own operation demand degree, the display area that each display window occupies in the display interface can be determined according to the operation demand degree, and the display window is filled into the display area it belongs to, yielding the corresponding data display interface.
When two display windows are displayed independently, they comprise a first window and a second window, and the operation demand degree of the first window is higher than that of the second window. When three display windows are displayed independently, they comprise a first window and two second windows (a first second window and a second second window), where the operation demand degree of the first window is higher than that of the first second window, and the operation demand degree of the first second window is higher than that of the second second window. Although there are two second windows, they are independent of each other, so the number of independently displayed display windows is essentially three. Both second windows are used below to display second images at small viewing angles.
305, displaying, according to the correspondence between operation heat degree and operation demand degree, the display window with the matching operation demand degree in each display area of the initial interface with a different operation heat degree, so as to obtain the data display interface.
The correspondence between operation heat degree and operation demand degree is preset, and the preset correspondence is obtained. In the embodiments of the application, a high operation heat degree corresponds to a high operation demand degree and a low operation heat degree corresponds to a low operation demand degree. The display windows with the corresponding operation demand degrees are filled into the different display areas of the initial interface and displayed, yielding the data display interface. It should be noted that at this point no image is displayed on the data display interface yet.
Please refer to FIG. 6, which is a schematic interface diagram of different display windows filled into different display areas according to an embodiment of the application. The initial interface contains three display areas whose operation heat degrees satisfy: area (1) > area (2) > area (3). Meanwhile, assume the operation demand degrees of the display windows satisfy: first window > first second window > second second window. According to the correspondence between operation heat degree and operation demand degree, the first window is filled into area (1), the first second window into area (2), and the second second window into area (3). The interface shown in FIG. 6 is the display interface obtained after the initial interface has been filled. The embodiments of the application mainly refer to control operations on the second windows, but it should be noted that control operations may also be performed on the first window.
Because the display windows are placed into the display areas according to the correspondence between operation heat degree and operation demand degree, both quantities are taken into account when laying out the interface, so the resulting layout conforms to the user's operation habits and the interface operation efficiency is improved.
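The pairing of display windows with display areas described in step 305 can be sketched as follows; this is an illustrative fragment under the assumption that each display area already carries an operation heat value from step 303 and each display window carries an operation demand degree from step 304, and the names used are not the patent's.

```cpp
#include <algorithm>
#include <string>
#include <utility>
#include <vector>

struct DisplayRegion { int id; double operationHeat; };          // result of step 303
struct DisplayWindow { std::string name; double demandDegree; }; // result of step 304

// Pair each display window with a display region so that the window with the highest
// operation demand degree lands in the region with the highest operation heat degree,
// the second highest with the second highest, and so on.
std::vector<std::pair<std::string, int>> assignWindowsToRegions(
        std::vector<DisplayWindow> windows, std::vector<DisplayRegion> regions) {
    std::sort(windows.begin(), windows.end(),
              [](const DisplayWindow& a, const DisplayWindow& b) {
                  return a.demandDegree > b.demandDegree;
              });
    std::sort(regions.begin(), regions.end(),
              [](const DisplayRegion& a, const DisplayRegion& b) {
                  return a.operationHeat > b.operationHeat;
              });

    std::vector<std::pair<std::string, int>> placement;   // (window name, region id)
    for (size_t i = 0; i < windows.size() && i < regions.size(); ++i)
        placement.emplace_back(windows[i].name, regions[i].id);
    return placement;
}
```

Applied to the example of FIG. 6, this pairing places the first window into area (1), the first second window into area (2), and the second second window into area (3).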
202, the central processing unit copies the image data to obtain first image data and second image data, and transmits the first image data to the first processing unit and the second image data to the second processing unit.
The central processing unit first obtains the image data from the memory and, after obtaining them, makes two copies, yielding the first image data and the second image data. The first image data are transmitted to the first processing unit so that the first processing unit processes them; the second image data are transmitted to the second processing unit so that the second processing unit processes them.
Take image data collected by a fisheye camera as an example. The shooting range of a fisheye camera is approximately a hemisphere, and the obtained image is approximately circular; if the viewing angle of the fisheye camera is exactly 180 degrees, the shooting range is exactly a hemisphere, and the obtained image appears as a circle on the two-dimensional plane.
Fig. 7 is a schematic diagram of initial image data directly acquired by the fisheye camera provided in the embodiment of the present application, and a middle circular area is an initial image captured by the fisheye camera. In fig. 7, the fisheye camera faces the sky, and the captured image includes the sky, buildings, trees, and the like around the position where the fisheye camera is located.
203, the first processing unit renders the first image data into a first image under a large viewing angle, and displays the first image in a first window of the data display interface.
Specifically, the step of rendering the first image data into the first image at a large viewing angle by the first processing unit includes: the first processing unit renders the first image data into a first image at a large viewing angle according to the first projection matrix and the first image model.
It should be noted that the image models referred to in this application are image models in a virtual scene. In a virtual scene an object coordinate system generally has to be constructed first, and the model is then built in that coordinate system (commonly called modeling). The first image model built in the embodiments of the application is spherical; in other cases image models of different shapes may correspond to specific usage scenarios, such as a cuboid, in which case the first image may be the image on one face of the cuboid. Taking a spherical first image model as an example, it can simply be understood as a sphere formed by dividing the model into n circles along the longitude direction and allocating m points to each circle, for example n = 180 and m = 30. The larger n and m are, the rounder the resulting sphere.
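The spherical modeling just described can be sketched as follows; the fragment tessellates a unit sphere into n circles with m points per circle and attaches a texture coordinate to every vertex. The equirectangular texture mapping used here is only an assumption made for illustration; the patent does not specify how the fisheye texture is mapped onto the sphere.

```cpp
#include <cmath>
#include <vector>

struct Vertex {
    float x, y, z;   // position on the unit sphere (object coordinates)
    float u, v;      // texture coordinate used to sample the image data
};

std::vector<Vertex> buildSphereModel(int n /*circles*/, int m /*points per circle*/) {
    std::vector<Vertex> vertices;
    const float PI = 3.14159265358979f;
    for (int i = 0; i <= n; ++i) {                 // latitude: 0 .. pi
        float theta = PI * i / n;
        for (int j = 0; j <= m; ++j) {             // longitude: 0 .. 2*pi
            float phi = 2.0f * PI * j / m;
            Vertex vtx;
            vtx.x = std::sin(theta) * std::cos(phi);
            vtx.y = std::cos(theta);
            vtx.z = std::sin(theta) * std::sin(phi);
            vtx.u = static_cast<float>(j) / m;     // simple equirectangular mapping (assumed)
            vtx.v = static_cast<float>(i) / n;
            vertices.push_back(vtx);
        }
    }
    return vertices;                               // e.g. n = 180, m = 30 as in the text
}
```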
After the model is built, a projection matrix can be constructed. In a virtual scene, a coordinate system in which an object (or a model, which is displayed as an object after texture mapping on the model) is located is called an object coordinate system, and a camera coordinate system is a three-dimensional coordinate system established with a focus center of a camera as an origin and corresponds to a world coordinate system. The virtual camera, the object, etc. are all in the world coordinate system. The relationships among the virtual camera, the object, the model in the world coordinate system, the wide angle and the elevation angle of the virtual camera, the distance from the lens to the near plane and the far plane, and the like are all embodied in the projection matrix.
FIG. 8 is a schematic diagram of perspective projection imaging provided in an embodiment of the present application. The distance from the lens of the virtual camera to the near plane 11 is the distance between point 0 and point 1, and the distance from the lens of the virtual camera to the far plane 12 is the distance between point 0 and point 2. The position of the virtual camera can simply be understood as the coordinates of point 0 in the world coordinate system.
The first projection matrix may be determined as follows: acquire the preset initial parameters of the first virtual camera, which include the position of the first virtual camera, its Euler angles, the distance from the lens of the first virtual camera to the projection plane (also called the near plane), the distance from the lens to the far plane, and so on; then determine the first projection matrix from these initial parameters. For example, the first projection matrix can be computed with a math library: the initial parameters of the first virtual camera are passed to the corresponding function of the GLM (OpenGL Mathematics) library, and the function calculates the first projection matrix. It should be noted that the first projection matrix determined from the preset initial parameters of the first virtual camera can also be understood as the initial first projection matrix. In the embodiments of the application the initial first projection matrix never changes, so the first projection matrix is always the initial first projection matrix.
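A minimal sketch of computing such an initial first projection matrix with the GLM library mentioned above might look as follows; the concrete parameter values (field of view, aspect ratio, camera position, near and far distances) are placeholders rather than values given in the patent, and the view transform is folded into the returned matrix for brevity.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::mat4 buildFirstProjection() {
    float fovY   = glm::radians(60.0f);  // wide/elevation angle of the first virtual camera
    float aspect = 1.0f;                 // the first window is assumed square here
    float zNear  = 0.1f;                 // lens-to-near-plane distance
    float zFar   = 100.0f;               // lens-to-far-plane distance
    glm::mat4 projection = glm::perspective(fovY, aspect, zNear, zFar);

    // View part: the first virtual camera sits outside the sphere model and looks at its
    // centre, so that the whole model fits inside the viewing frustum (the large view).
    glm::mat4 view = glm::lookAt(glm::vec3(0.0f, 0.0f, 3.0f),   // camera position
                                 glm::vec3(0.0f, 0.0f, 0.0f),   // look-at target
                                 glm::vec3(0.0f, 1.0f, 0.0f));  // up direction
    return projection * view;            // combined matrix handed to the shaders
}
```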
After the first image model and the first projection matrix are determined, the first image data are processed according to the first projection matrix and the first image model, and a first image under a large viewing angle is obtained. Specifically, a CPU obtains a first projection matrix and a first image model; the CPU sends the first projection matrix and the first image model to the first processing unit; the first processing unit renders the first image data into a first image under a large viewing angle according to the first projection matrix and the first image model. For example, a vertex in the first image model is sent to a vertex shader, a texture coordinate in the first image model is sent to a fragment shader, a texture unit corresponding to the texture coordinate is determined according to the first image data, and the first image under a large viewing angle is obtained by rendering through the first processing unit.
The large viewing angle refers to a viewing angle at which at least complete image data can be seen in the field of view after rendering. It can be simply understood that a large viewing angle is a viewing angle at which the first virtual camera is placed farther outside the first image model, so that the complete planar image corresponding to the first image model is seen within the field of view. The large view angle is essentially the view angle corresponding to the placement of the first image model into the viewing frustum of the first virtual camera. In a large viewing angle, the first virtual camera is located outside the first image model.
As shown in FIG. 9, the first virtual camera 21 is located outside the first image model 20; the view frustum is the trapezoidal area between the near plane 22 and the far plane 23, and the first image model 20 lies completely within the view frustum. In this step the first image under a large viewing angle is obtained, which allows the user to understand the content of the image data as a whole.
After the first processing unit has produced the first image under the large viewing angle, the first image is displayed in the first window of the data display interface.
The data display interface comprises at least one first window and at least one second window. Referring to fig. 10, fig. 10 is a schematic diagram of a data display interface provided in an embodiment of the present application. The data presentation interface 30 includes a first window 31 located on the left side of the data presentation interface and two second windows 32 located on the right side of the first window 31. The bottom layer in the first window 31 shows a first image. As can be seen from fig. 10, the obtained first image corresponds/matches the image data. Wherein the first window and/or the second window may exist on the data presentation interface 30 in the form of a display control.
204, the central processing unit determines a second projection matrix according to the control operation of the user on a second window of the data display interface, and transmits the second projection matrix to the second processing unit and the third processing unit.
The user can perform a control operation on the second window of the data display interface. The control operation can be realized as a sliding touch operation of the user on the second window; it can also be realized by voice, for example by detecting speech on the data display interface, recognizing it to obtain an instruction such as "slide left by 2 cm", and completing the control operation according to the instruction; or it can be realized by detecting the user's gesture on the second window and deriving the control operation from the gesture. The specific implementation of the control operation is not limited here.
In the embodiment of the present application, a sliding touch operation is described as an example. The event of the control operation corresponding to the sliding touch operation includes a sliding event, a click event, and the like. The slide event is used to control various conditions during the finger slide. The sliding event includes a finger-down event, a finger-moving event (including information such as a coordinate point of finger movement), a finger-up event, and the like. And the control operation of the user on the second window of the data display interface comprises a sliding event, a clicking event and the like triggered by the user on the second window.
The control operation of the user on the second window of the data display interface is determined by detecting the corresponding events triggered on the second window.
Like the first projection matrix, the second projection matrix also has an initial value, the initial second projection matrix. For example, when the data display interface has just been opened or refreshed, the current second projection matrix is the initial second projection matrix. The initial second projection matrix may be determined as follows: acquire the preset initial parameters of the second virtual camera, including the position of the second virtual camera, its Euler angles, the distance from the lens of the second virtual camera to the near plane, and the distance from the lens to the far plane; then determine the initial second projection matrix from these initial parameters. The initial second projection matrix may also simply be preset. The initial first projection matrix and the initial second projection matrix are different. Unlike the first projection matrix, the second projection matrix does not keep its initial value, because it changes according to the control operations the user performs on the second window.
The user may perform a control operation on the currently presented second image within the second window, such as after opening the data presentation interface, to facilitate the user in viewing the area of interest.
Specifically, the step of determining the second projection matrix according to the user's control operation on the second window of the data display interface includes: determining, from the control operation, the operation parameters corresponding to it; and determining the second projection matrix based on the operation parameters. It will be appreciated that the second projection matrix determined from the operation parameters is an updated second projection matrix; more precisely, the updated (current) second projection matrix is determined from the operation parameters together with the previously determined second projection matrix. As long as the control operation has not finished, the user's control operation on the second window continues to be acquired.
The operation parameters determined from the control operation include, for example, the operation acceleration and the operation distance. An operation angle is determined from the operation parameters, and the second projection matrix is then determined from the operation angle and the previously determined second projection matrix. For example, if a control operation by the user on the second window at the upper right of FIG. 6 is detected, the second projection matrix is determined according to that control operation.
It is understood that, in determining the second projection matrix according to the control operation (sliding touch operation) of the user on the second window, since the sliding touch operation changes, for example, the position where the finger slides changes all the time, the control operation also changes all the time, and thus the second projection matrix is also updated.
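One possible way to turn the finger displacement of a sliding touch operation into an updated matrix for the second virtual camera is sketched below; the sensitivity constant, the yaw/pitch convention and all numeric values are assumptions made for illustration and are not prescribed by the patent.

```cpp
#include <cmath>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

struct SmallViewCamera {
    float yaw = 0.0f;     // horizontal angle, radians
    float pitch = 0.0f;   // vertical angle, radians
    glm::mat4 projection = glm::perspective(glm::radians(45.0f), 1.0f, 0.1f, 10.0f);
};

// dx, dy: finger displacement in pixels between two consecutive finger-move events.
glm::mat4 updateSecondMatrix(SmallViewCamera& cam, float dx, float dy) {
    const float sensitivity = 0.005f;   // pixels -> radians, an assumed constant
    cam.yaw += dx * sensitivity;
    cam.pitch = glm::clamp(cam.pitch + dy * sensitivity,
                           glm::radians(-89.0f), glm::radians(89.0f));

    // Viewing direction of the camera sitting at the centre of the sphere model.
    glm::vec3 dir(std::cos(cam.pitch) * std::sin(cam.yaw),
                  std::sin(cam.pitch),
                  std::cos(cam.pitch) * std::cos(cam.yaw));
    glm::mat4 view = glm::lookAt(glm::vec3(0.0f),                // camera at sphere centre
                                 dir,                            // looking outwards
                                 glm::vec3(0.0f, 1.0f, 0.0f));   // up direction
    return cam.projection * view;       // the updated "second projection matrix"
}
```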
After the second projection matrix is determined, it is transmitted to the second processing unit and the third processing unit, so that the two units each perform their own processing according to the second projection matrix.
205, the second processing unit generates a second image under a small viewing angle according to the second projection matrix and the second image data, and displays the second image in a second window.
Specifically, the second processing unit generates a second image under a small viewing angle according to the second projection matrix, a second image model and the second image data, and displays the second image in the second window, where the second projection matrix is different from the first projection matrix and the second image model is the same as the first image model.
Wherein the second image model may be predetermined. In this embodiment, the second image model is the same as the first image model, and the first image model can be directly obtained as the second image model.
The step of generating a second image under a small viewing angle according to the second projection matrix, the second image model and the second image data includes: the CPU obtains a second image model; the CPU transmits the second image model to the second processing unit; the second processing unit generates a second image under a small viewing angle according to the second projection matrix, the second image model and the second image data. Specifically, the CPU transmits a vertex in the second image model to the vertex shader, copies a texture coordinate in the second image model to the fragment shader, determines a texture unit corresponding to the texture coordinate according to the second image data, and performs rendering by using the second processing unit to generate the second image at the small viewing angle.
The small viewing angle refers to a viewing angle at which only local image data can be seen in the field of view after rendering. It can simply be understood as the viewing angle obtained by placing the second virtual camera inside the second image model, so that only a local part of the planar image corresponding to the second image model is projected into the field of view.
As shown in FIG. 11, the second virtual camera 41 is located inside the second image model 40; the view frustum is the trapezoidal region between the near plane 42 and the far plane 43, and part of the second image model 40 lies inside the view frustum. The part inside the view frustum is what is referred to below as the target navigation area. It should be noted that the second image model 40 is identical to the first image model 20; in this figure the first image model 20 and the second image model 40 are drawn separately merely for convenience of illustration.
In this step the second image under the small viewing angle is obtained, so that the user can understand the content of the image data locally (under a small viewing angle), which improves the efficiency of understanding the image data content.
After the second image under the small viewing angle is generated, it is displayed in the second window of the data display interface. It will be appreciated that, because the second projection matrix is continuously updated, the second image generated from the second projection matrix, the second image model and the second image data is also continuously updated, and the second image presented in the second window is updated synchronously.
If there is only one second window 32 on the data display interface, a second image under a small viewing angle is displayed in that second window. If the data display interface includes a plurality of second windows 32, second images under different small viewing angles are displayed in the plurality of second windows: each second window 32 corresponds to a different small viewing angle, so the displayed second images are different as well.
In the above steps, the first window of the data display interface displays the first image under the large viewing angle and the second window displays the second image under the small viewing angle, so planar images of the image data under different viewing angles are obtained. The image data can thus be understood from different viewing angles, which makes it easier for the user to grasp the content of the image data and improves understanding efficiency. Moreover, a control operation can be performed in the second window to control the displayed second image, so that every area visible under the small viewing angle can be reached through control operations and the user can quickly locate the area of interest in the second image. It can be understood that while a control operation is being performed on the second window, the second image displayed in the second window keeps changing.
The first image and the second image are projected from the same image model (the first image model and the second image model are identical) under the large and small viewing angles respectively, and are mapped with the same texture (the image data). The image data is understood as a whole through the first image under the large viewing angle and locally through the second image under the small viewing angle, which realizes the detail display of the image data. While the second window is being controlled under the small viewing angle, the second image changes continuously. Moreover, because the second image model is spherical (a 360-degree surface without boundaries), the second image easily repeats itself as the second window is rotated during the control operation. Therefore, while controlling the second window, the user may not know which part of the first image the second image displayed in the second window corresponds to, which slows down the positioning of the area of interest and seriously affects the user experience. The embodiment of the present application solves this technical problem through step 206.
In step 206, the third processing unit determines a target navigation image according to the second projection matrix so as to highlight the target navigation image in the first window, wherein the target navigation image represents the position information of the second image under the small viewing angle within the first image under the large viewing angle.
By displaying the target navigation image in the first window, the user can clearly know, while controlling the second window, which part of the first image the second image currently displayed in the second window corresponds to, which increases the speed at which the user locates the area of interest and improves the user experience.
In one case, the step of the third processing unit determining the target navigation image according to the second projection matrix, so as to highlight the target navigation image in the first window, includes: the third processing unit determines, according to the second projection matrix, a target navigation area of the second image within the first image model corresponding to the first image; and the third processing unit processes the target navigation area to obtain a target navigation image, so as to highlight the target navigation image in the first window.
The step of determining, according to the second projection matrix, the target navigation area of the second image within the first image model corresponding to the first image includes: determining the target navigation area of the second image within the first image model corresponding to the first image according to the second projection matrix and the first image model.
It is to be understood that the first image or the second image, determined as described above from a projection matrix (the first or second projection matrix, respectively) and an image model (the first or second image model, respectively), is an image obtained by the imaging principle of perspective projection. As shown in fig. 8, the projection of the points of the image model lying between the near plane 11 and the far plane 12 can be seen in the field of view.
According to the imaging principle of perspective projection, the part visible in the field of view is obtained by multiplying the vertices of the image model by the projection matrix; the vertices falling on the near plane are then normalized, clipped and finally displayed through texture mapping. Therefore, determining the target navigation area of the second image within the first image model corresponding to the first image can be converted into the following problem: determine which vertices of the first image model can be projected onto the near plane of the second projection matrix; once those vertices are determined, the area corresponding to them is taken as the target navigation area, the texture coordinates corresponding to the target navigation area are specified for highlighting, and the area is rendered and displayed. Further, which vertices of the first image model can be projected onto the near plane of the second projection matrix can be determined from the second projection matrix and the first image model. The target navigation area refers to the target area of the second image within the first image model; the points in the target navigation area are all points of the first image model, i.e. three-dimensional points.
In one case, the step of determining, from the second projection matrix and the first image model, the target navigation area of the second image within the first image includes: the CPU obtains the first image model and sends it to the third processing unit; the third processing unit determines, from the vertices of the first image model and according to the second projection matrix, the target vertices that are projected onto the near plane corresponding to the second projection matrix; and the area corresponding to the target vertices is taken as the target navigation area of the second image within the first image model corresponding to the first image. The area corresponding to the target vertices is understood as the area where the target vertices are located.
The target vertices are understood as the vertices of the first image model that can be projected onto the near plane of the second projection matrix. The step of determining, from the vertices of the first image model and according to the second projection matrix and the first image model, the target vertices projected onto the near plane corresponding to the second projection matrix specifically includes: the third processing unit determines the projected coordinates of the vertices of the first image model according to the second projection matrix, i.e. each vertex of the first image model is multiplied by the second projection matrix to obtain its projected coordinates; and the third processing unit determines, from the projected coordinates of the vertices of the first image model, the target vertices projected onto the projection plane corresponding to the second projection matrix. The latter step includes: the third processing unit detects whether the projected coordinates of each vertex lie within the range of the projection plane corresponding to the second projection matrix; if so, the vertex is determined to be a target vertex; if not, the vertex is determined to be a non-target vertex. After projection onto the near plane of the second projection matrix, the target vertices are visible to the user while the non-target vertices are not.
Specifically, if the first image model is divided into 180 circles by longitude and 30 points are assigned to each circle, the number of vertices is 180 × 30. The third processing unit takes all the vertex coordinates as one matrix and multiplies the second projection matrix by this vertex coordinate matrix to determine the projected coordinates of the vertices; if the projected coordinates of a vertex lie within the range of the near plane corresponding to the second projection matrix, the vertex is determined to be a target vertex, otherwise it is determined to be a non-target vertex. It can be understood that once the second projection matrix is determined, the range of the near plane corresponding to the second projection matrix is also determined. If the projected coordinate (x1, y1, z1) has x1 and y1 within the range [-1, 1], i.e. -1 ≤ x1 ≤ 1 and -1 ≤ y1 ≤ 1, the projected coordinate is determined to lie within the range of the near plane corresponding to the second projection matrix. After the target vertices are determined, the area corresponding to the target vertices is taken as the target navigation area within the first image model corresponding to the second image. It should be noted that the projected z1 coordinate need not be examined here, since the near plane is two-dimensional and all z-axis coordinates on it are equal; the projected z1 coordinate is instead used as the depth of field to achieve a near-far effect.
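The target-vertex test described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the identity matrix stands in for the second projection matrix, the sample points stand in for the first image model's vertices, and only the x1/y1 range check named in the text is applied.

```python
# Illustrative sketch: project every vertex of the first image model with the
# second projection matrix in one batched matrix operation, then keep the
# vertices whose projected x and y fall inside [-1, 1] as target vertices.
import numpy as np

def find_target_vertices(vertices, second_proj_matrix):
    """vertices: N x 3 model-space points; returns a boolean mask of target vertices."""
    n = vertices.shape[0]
    homogeneous = np.hstack([vertices, np.ones((n, 1))])   # N x 4
    projected = homogeneous @ second_proj_matrix.T          # one batched multiply for all vertices
    w = projected[:, 3:4]
    ndc = projected[:, :3] / np.where(w == 0, 1e-9, w)      # perspective divide
    # the description only checks x1 and y1; z1 is kept as depth of field
    return (np.abs(ndc[:, 0]) <= 1.0) & (np.abs(ndc[:, 1]) <= 1.0)

# placeholder data: 180 x 30 points scattered on a unit sphere, identity as projection matrix
rng = np.random.default_rng(0)
sphere_pts = rng.normal(size=(180 * 30, 3))
sphere_pts /= np.linalg.norm(sphere_pts, axis=1, keepdims=True)
mask = find_target_vertices(sphere_pts, np.eye(4))
print("target vertices:", int(mask.sum()), "of", mask.size)
```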
This can be simply understood as follows: the first image is obtained by multiplying the first projection matrix (whose virtual camera is outside the first image model) by the vertices of the first image model, and then clipping, rendering and so on; the second image is obtained by multiplying the second projection matrix (whose virtual camera is inside the second image model) by the vertices of the second image model, and then clipping, rendering and so on; then, by multiplying the second projection matrix, defined from inside the model, by the first image model, it can be derived which vertices of the first image model can be projected onto the near plane of the second projection matrix, and the vertices so obtained are taken as the target vertices.
It is noted that the above determination of the target navigation area of the second image within the first image model, based on the second projection matrix and the first image model, is performed by the third processing unit. The third processing unit computes the projected coordinates of the vertices of the first image model in matrix form, which greatly increases the processing speed and reduces the power consumption of the mobile terminal. It can be understood that if the CPU were used for this calculation, it would have to traverse every vertex of the first image model, i.e. 180 × 30 vertices, and compute the projected coordinates of each vertex from the second projection matrix one at a time, which would greatly reduce the processing speed and increase the power consumption of the mobile terminal. Moreover, when computing the projected coordinates of the vertices of the first image model, the CPU's floating-point efficiency is low and its error is larger, whereas the GPU is designed for floating-point operations, so both the efficiency and the processing accuracy are greatly improved. It can be understood that, in the fragment shader of the third processing unit of the GPU, the vertices and texture coordinates of the first image model and the second projection matrix can be passed in together, and whether a vertex of the first image model is a target vertex can be determined there (after which the transparency value is adjusted directly).
After the target navigation area is determined, the third processing unit processes the target navigation area to obtain the target navigation image, so as to highlight the target navigation image in the first window; the target navigation image represents the position information of the second image under the small viewing angle within the first image under the large viewing angle.
As can be seen from the above description, if the user performs a sliding touch operation on the second window so that the control operation changes, the second projection matrix determined from the control operation is updated synchronously, the second image generated from the second projection matrix is updated, the corresponding target navigation area is updated, and the target navigation image obtained by processing the target navigation area is updated as well. Since the target navigation image represents the position information of the second image within the first image, the target navigation image displayed in the first window is continuously kept up to date.
After determining the target vertices, the third processing unit determines the texture coordinates corresponding to the target vertices; the third processing unit then processes the target navigation area in a preset manner according to these texture coordinates to obtain the target navigation image, so as to highlight, in the first window, the target navigation image representing the position of the second image within the first image.
It should be noted that, if the CPU were used for this processing, then after the CPU determined the target vertices and their corresponding texture coordinates, the texture coordinates would have to be copied to the third processing unit in the GPU so that the GPU could process the target navigation area according to them and highlight it in the first window. With the scheme of the embodiment of the present application, the third processing unit determines the target vertices and the corresponding texture coordinates itself, so no texture coordinates need to be copied; a large amount of CPU-to-GPU copying time is saved, the processing efficiency is further improved, and the power consumption of the mobile terminal is further reduced.
The step of the third processing unit processing the target navigation area in a preset manner according to the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window, includes: acquiring a target navigation area preset texture and a first preset transparency, wherein the target navigation area preset texture is a preset color or a preset picture; and the third processing unit rendering the target navigation area according to the target navigation area preset texture, the first preset transparency and the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window. Specifically, the third processing unit sets the texture corresponding to the texture coordinates to the target navigation area preset texture, sets the transparency of that texture to the first preset transparency, and then renders the target navigation area according to the texture so set. The target navigation area is thus rendered with the target navigation area preset texture at the first preset transparency, which achieves the purpose of highlighting the position of the second image within the first image.
Further, a region outside the target navigation area, i.e. the region corresponding to the non-target vertices, may be taken as a non-target navigation area. In this case, the step of the third processing unit processing the target navigation area in a preset manner according to the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window, includes:
acquiring a target navigation area preset texture, a first preset transparency and a second preset transparency, wherein the second preset transparency is smaller than the first preset transparency and the target navigation area preset texture is a preset color or a preset picture; the third processing unit rendering the target navigation area according to the target navigation area preset texture, the first preset transparency and the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window; and the third processing unit rendering the non-target navigation area with the second preset transparency. Rendering the target navigation area according to the target navigation area preset texture, the first preset transparency and the texture coordinates specifically includes: setting the texture corresponding to the texture coordinates to the target navigation area preset texture, setting the transparency of that texture to the first preset transparency, and the third processing unit rendering the target navigation area according to the texture so set. The target navigation area is thus rendered with the target navigation area preset texture at the first preset transparency, which achieves the purpose of highlighting the position of the second image within the first image.
It can be understood that, if the target navigation area is rendered after the first image, the target navigation image is displayed on top of the first image. In order not to block the region of the first image corresponding to the non-target navigation area and to improve the display effect, the second preset transparency is set to be less than 0.8; for example, it may be set to 0. In order to highlight the target navigation image while not completely covering the region of the first image corresponding to it, and thereby improve the user experience, the first preset transparency may be set within (0, 1), for example to 0.8. The preset color may be set to red to make the target navigation image stand out.
As shown in the left diagram of fig. 10, the target navigation image 33 and the rendered non-target navigation area lie on top of the first image. Since the current first preset transparency of the target navigation image 33 is not 1, the partial region of the first image underneath the target navigation image 33 can be seen through it, and that partial region is consistent with the second image. Since the second preset transparency is 0, the rendered non-target navigation area is transparent and cannot be seen by the human eye.
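The highlighting scheme above can be sketched as a simple alpha-blending pass. This is a minimal illustration under assumed values: a red preset color, a first preset transparency of 0.8, a second preset transparency of 0, and a placeholder rectangular mask standing in for the real target navigation area.

```python
# Hedged compositing sketch: draw the target navigation area in a preset colour
# at the first preset transparency, the non-target area fully transparent, and
# alpha-blend the result over the first image.
import numpy as np

def render_navigation_overlay(first_image, target_mask,
                              preset_color=(255, 0, 0),
                              first_alpha=0.8, second_alpha=0.0):
    """first_image: H x W x 3 uint8; target_mask: H x W bool marking the target area."""
    overlay = np.zeros_like(first_image, dtype=np.float32)
    overlay[target_mask] = preset_color                       # preset texture = preset colour
    alpha = np.where(target_mask, first_alpha, second_alpha)[..., None]
    blended = (1.0 - alpha) * first_image + alpha * overlay   # standard alpha blending
    return blended.astype(np.uint8)

first_image = np.full((180, 360, 3), 128, dtype=np.uint8)     # placeholder panorama
mask = np.zeros((180, 360), dtype=bool)
mask[60:120, 100:200] = True                                  # placeholder target navigation area
result = render_navigation_overlay(first_image, mask)
print(result.shape, result[90, 150], result[0, 0])            # blended pixel vs. untouched pixel
```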
In some other cases, a region outside the target navigation area, i.e. the region corresponding to the non-target vertices, is taken as a non-target navigation area, and the step of the third processing unit processing the target navigation area in a preset manner according to the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window, includes:
acquiring a target navigation area preset texture, a first preset transparency, a non-target navigation area preset texture and a second preset transparency, wherein the second preset transparency is smaller than the first preset transparency, the target navigation area preset texture is a first preset color or a first preset picture, and the non-target navigation area preset texture is a second preset color or a second preset picture; the third processing unit rendering the target navigation area according to the target navigation area preset texture, the first preset transparency and the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window; and the third processing unit rendering the non-target navigation area according to the non-target navigation area preset texture and the second preset transparency.
The first preset transparency and the second preset transparency can be set as described above; the target navigation area preset texture and the non-target navigation area preset texture may be the same or different. The target navigation area is highlighted, while the non-target navigation area is rendered with the non-target navigation area preset texture at the second preset transparency.
In the above embodiment, the target navigation area and the non-target navigation area are distinguished from each other, so that the target navigation image, i.e. the position of the second image within the first image, is highlighted even further, which improves the user experience.
It should be noted that the step in which the third processing unit processes the target navigation area according to the texture coordinates to obtain the target navigation image, so as to highlight the target navigation image in the first window, may be implemented in a number of scenarios.
For example, in one implementation scenario, there is only one display control in the first window, and this display control can display both the target navigation image (and the rendered non-target navigation area) and the first image. The display control comprises two texture units: a first texture unit and a second texture unit, wherein the first texture unit is used to display the first image, the second texture unit is used to display the target navigation image (and the rendered non-target navigation area), and the second texture unit is located on top of the first texture unit. Specifically, before the step of displaying the first image in the first window of the data display interface, the method further includes: acquiring the first texture unit and the second texture unit of the display control of the first window; and arranging the second texture unit on top of the first texture unit. Accordingly, the step of displaying the first image in the first window of the data display interface includes: displaying the first image in the first texture unit of the display control of the first window. The step of highlighting the target navigation image in the first window includes: highlighting the target navigation image (and the rendered non-target navigation area) in the second texture unit of the display control of the first window. It should be noted that, in this case, while the step of processing the target navigation area in the preset manner to obtain the target navigation image and highlighting it in the second texture unit of the display control of the first window is being executed, the step of rendering the first image data into the first image under the large viewing angle and displaying it in the first texture unit of the display control of the first window is executed synchronously. It can be understood that, because the first image and the target navigation image are displayed in one display control, the first image and the target navigation image (and the non-target navigation area) must be rendered simultaneously; if only the target navigation image (and the non-target navigation area) were rendered, the first image would not be displayed in the first window, which would defeat the purpose of the present application. In this way, whenever the target navigation area is processed in the preset manner, the target navigation area (and the non-target navigation area) in the second texture unit is rendered and the first image corresponding to the first texture unit is rendered as well.
For example, in another implementation scenario, two display controls exist in the first window: the first display control is used to display the first image and the second display control is used to display the target navigation image (and the processed non-target navigation area). Specifically, before the step of displaying the first image in the first window of the data display interface, the method further includes: acquiring the first display control and the second display control of the first window; and arranging the second display control on top of the first display control. Accordingly, the step of displaying the first image in the first window of the data display interface includes: displaying the first image in the first display control of the first window of the data display interface. The step of highlighting the target navigation image in the first window includes: highlighting the target navigation image (and the rendered non-target navigation area) in the second display control of the first window. In this way, the first image and the target navigation image (and the rendered non-target navigation area) are displayed through two separate display controls and processed separately, which improves the processing efficiency: when the target navigation area is processed, only the content displayed in the second display control needs to be rendered, and the content displayed in the first display control does not, which reduces the consumption of the mobile terminal and improves the processing speed.
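The benefit of the two-display-control scenario can be sketched as keeping the first image in a cached base layer and redrawing only a separate overlay layer when the control operation changes. The class and method names below are illustrative assumptions, not API names from the patent.

```python
# Hedged sketch of the layered design: the base layer (first display control) is
# rendered once; only the overlay layer (second display control) is redrawn.
import numpy as np

class FirstWindow:
    def __init__(self, first_image):
        self.base_layer = first_image            # first display control: rendered once and cached
        self.overlay_layer = np.zeros(first_image.shape[:2] + (4,), dtype=np.float32)

    def update_overlay(self, target_mask, color=(1.0, 0.0, 0.0), alpha=0.8):
        """Second display control: only this layer is redrawn on a control operation."""
        self.overlay_layer[:] = 0.0
        self.overlay_layer[target_mask, :3] = color
        self.overlay_layer[target_mask, 3] = alpha

    def compose(self):
        a = self.overlay_layer[..., 3:4]
        rgb = self.overlay_layer[..., :3] * 255.0
        return ((1.0 - a) * self.base_layer + a * rgb).astype(np.uint8)

window = FirstWindow(np.full((180, 360, 3), 128, dtype=np.uint8))
mask = np.zeros((180, 360), dtype=bool)
mask[60:120, 100:200] = True
window.update_overlay(mask)                      # cheap: the base layer is left untouched
frame = window.compose()
print(frame.shape)
```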
It should be noted that the above steps 205 and 206 can be executed in series or in parallel; executing them in parallel improves the processing efficiency.
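A minimal sketch of running the two steps concurrently; the two worker functions are placeholders standing in for the second and third processing units, and on a real device both would be GPU passes rather than Python threads.

```python
# Hedged illustration only: submit step 205 (second image) and step 206 (target
# navigation image) as parallel tasks driven by the same second projection matrix.
from concurrent.futures import ThreadPoolExecutor

def generate_second_image(second_proj_matrix):                # step 205 placeholder
    return "second image"

def determine_target_navigation_image(second_proj_matrix):    # step 206 placeholder
    return "target navigation image"

second_proj_matrix = None   # stands in for the matrix derived from the control operation
with ThreadPoolExecutor(max_workers=2) as pool:
    f205 = pool.submit(generate_second_image, second_proj_matrix)
    f206 = pool.submit(determine_target_navigation_image, second_proj_matrix)
    second_image, nav_image = f205.result(), f206.result()
print(second_image, "|", nav_image)
```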
Through the above scheme, the position within the first image corresponding to the second image currently displayed in the second window is highlighted. From the target navigation image and its position within the first image displayed in the first window, the user can clearly identify which part of the image data the second image displayed in the second window corresponds to, so an association between the images at different viewing angles is established, which further improves the understanding efficiency of the image data content. It also makes it convenient for the user to adjust the viewed area and guides the user quickly to the area of interest, which increases the speed at which the user locates the area of interest in the image data and improves the user experience. In addition, the second image displayed through the second window also realizes the detail display of the image data. The data display method in the embodiment of the present application can therefore be applied to more application scenarios.
It should be noted that the first projection matrix and the second projection matrix in the embodiment of the present application each correspond to an MVP matrix, where MVP stands for model, view and perspective (projection). The model matrix corresponds to the operation matrix applied to the image model, the view matrix mainly corresponds to the position and orientation of the virtual camera, and the perspective matrix corresponds to information such as the Euler angles, the near plane and the far plane of the virtual camera.
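The MVP decomposition can be illustrated with a minimal numerical sketch. The parameter values (field of view, aspect ratio, near and far planes, camera pose) are illustrative assumptions; the patent does not prescribe them.

```python
# Hedged sketch of building an MVP matrix: model transform, camera (view) transform
# placed at the centre of the spherical model, and a perspective projection.
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ])

def look_at(eye, center, up):
    fwd = center - eye; fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up); right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    rot = np.eye(4); rot[0, :3], rot[1, :3], rot[2, :3] = right, true_up, -fwd
    trans = np.eye(4); trans[:3, 3] = -eye
    return rot @ trans

model = np.eye(4)                                                         # no extra model transform
view = look_at(np.zeros(3), np.array([0.0, 0.0, -1.0]), np.array([0.0, 1.0, 0.0]))  # camera at sphere centre
proj = perspective(fov_y_deg=60.0, aspect=16 / 9, near=0.1, far=10.0)
mvp = proj @ view @ model                                                 # analogue of the second projection matrix
print(mvp.shape)
```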
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions (computer programs) which are stored in a computer-readable storage medium and loaded and executed by a processor, or by related hardware controlled by the instructions (computer programs). To this end, an embodiment of the present application provides a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps of any embodiment of the data presentation method provided in the embodiment of the present application.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps of any data display method embodiment provided in the present application, the beneficial effects achievable by any data display method provided in the embodiments of the present application can be achieved; these are detailed in the foregoing embodiments and will not be described herein again.
The data display method, the mobile terminal and the storage medium provided by the embodiments of the present application have been introduced in detail above. Specific examples have been used herein to explain the principles and implementation of the present application, and the description of the above embodiments is only intended to help in understanding the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as a limitation of the present application.

Claims (10)

1. A data display method, characterized in that it is applicable to a mobile terminal, wherein the mobile terminal comprises a central processing unit, a memory, a first processing unit, a second processing unit and a third processing unit, the first processing unit, the second processing unit and the third processing unit running on a graphics processor; the data display method comprises the following steps:
the central processing unit determines the size and the position of a first window and a second window in the data display interface according to historical operation data;
the central processing unit copies image data to obtain first image data and second image data, and transmits the first image data to the first processing unit and transmits the second image data to the second processing unit;
the first processing unit renders the first image data into a first image under a large visual angle, and displays the first image in a first window of a data display interface;
the central processing unit determines a second projection matrix according to control operation of a user on a second window of the data display interface, and transmits the second projection matrix to the second processing unit and the third processing unit;
the second processing unit generates a second image under a small visual angle according to the second projection matrix and the second image data, and displays the second image in the second window;
the third processing unit determines a target navigation image according to the second projection matrix so as to highlight the target navigation image in the first window, wherein the target navigation image represents position information of the second image under a small viewing angle within the first image under a large viewing angle.
2. The data presentation method of claim 1, wherein the step of the central processor determining the size and position of the first window and the second window in the data presentation interface based on the historical operating data comprises:
the central processing unit generates an initial interface corresponding to the data display interface;
receiving historical operation data of the current user through the initial interface;
dividing the initial interface into at least two display areas with different operation heat degrees according to triggered historical operation data in each unit area in the initial interface;
marking each display window as a display window with different operation demand degrees according to historical operation frequency data of each display window in image data display, wherein the display windows comprise a first window and a second window;
and displaying a display window corresponding to the operation demand degree in the display areas of different operation heat degrees corresponding to the initial interface according to the corresponding relation between the operation heat degrees and the operation demand degree to obtain the data display interface.
3. The data presentation method of claim 1, wherein the step of determining, by the third processing unit, a target navigation image according to the second projection matrix for highlighting the target navigation image in the first window comprises:
the third processing unit determines a target navigation area of the second image in a first image model corresponding to the first image according to the second projection matrix;
and the third processing unit processes the target navigation area to obtain a target navigation image so as to highlight the target navigation image in the first window.
4. The data display method of claim 3, wherein the step of processing the target navigation area by the third processing unit to obtain a target navigation image so as to highlight the target navigation image in the first window comprises:
the third processing unit determines texture coordinates of a target vertex corresponding to the target navigation area;
and the third processing unit processes the target navigation area in a preset manner according to the texture coordinates to obtain a target navigation image so as to highlight the target navigation image in the first window.
5. The data presentation method of claim 1, wherein the step of rendering the first image data into a first image at a large viewing angle by the first processing unit comprises:
the first processing unit renders the first image data into a first image under a large viewing angle according to a first projection matrix and a first image model.
6. The data presentation method of claim 1, wherein the step of generating a second image at a small viewing angle by the second processing unit according to the second projection matrix and the second image data comprises:
and the second processing unit generates a second image under a small visual angle according to the second projection matrix, a second image model and the second image data, wherein the second image model is the same as the first image model corresponding to the first image.
7. A mobile terminal, characterized in that the mobile terminal comprises: one or more central processors; a memory; one or more graphics processors; and one or more computer programs, wherein the central processor is coupled to the memory and the graphics processors, and the one or more computer programs are stored in the memory and configured to be executed by the central processor and the graphics processors to perform the following steps:
the central processing unit determines the size and the position of a first window and a second window in the data display interface according to historical operation data;
the central processing unit copies image data to obtain first image data and second image data, and transmits the first image data to the first processing unit and transmits the second image data to the second processing unit;
the first processing unit renders the first image data into a first image under a large visual angle, and displays the first image in a first window of a data display interface;
the central processing unit determines a second projection matrix according to control operation of a user on a second window of the data display interface, and transmits the second projection matrix to the second processing unit and the third processing unit;
the second processing unit generates a second image under a small visual angle according to the second projection matrix and the second image data, and displays the second image in the second window;
the third processing unit determines a target navigation image according to the second projection matrix so as to highlight the target navigation image in the first window, wherein the target navigation image represents position information of the second image under a small viewing angle within the first image under a large viewing angle.
8. The mobile terminal of claim 7, wherein the central processor further performs the steps of:
the central processing unit generates an initial interface corresponding to the data display interface;
receiving historical operation data of the current user through the initial interface;
dividing the initial interface into at least two display areas with different operation heat degrees according to triggered historical operation data in each unit area in the initial interface;
marking each display window as a display window with different operation demand degrees according to historical operation frequency data of each display window in image data display, wherein the display windows comprise a first window and a second window;
and displaying a display window corresponding to the operation demand degree in the display areas of different operation heat degrees corresponding to the initial interface according to the corresponding relation between the operation heat degree and the operation demand degree to obtain a data display interface.
9. The mobile terminal according to claim 7, wherein the third processing unit of the graphics processor, when executing the step of determining the target navigation image according to the second projection matrix to highlight the target navigation image in the first window, specifically executes the following steps:
the third processing unit determines a target navigation area of the second image in a first image model corresponding to the first image according to the second projection matrix;
and the third processing unit processes the target navigation area to obtain a target navigation image so as to highlight the target navigation image in the first window.
10. The mobile terminal according to claim 9, wherein the third processing unit, when executing the step of processing the target navigation area to obtain a target navigation image, so as to highlight and display the target navigation image in the first window, specifically executes the following steps:
the third processing unit determines texture coordinates of a target vertex corresponding to the target navigation area;
and the third processing unit processes the target navigation area in a preset manner according to the texture coordinates to obtain a target navigation image so as to highlight the target navigation image in the first window.
CN202011116845.9A 2020-10-19 2020-10-19 Data display method and mobile terminal Active CN112308757B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011116845.9A CN112308757B (en) 2020-10-19 2020-10-19 Data display method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011116845.9A CN112308757B (en) 2020-10-19 2020-10-19 Data display method and mobile terminal

Publications (2)

Publication Number Publication Date
CN112308757A true CN112308757A (en) 2021-02-02
CN112308757B CN112308757B (en) 2024-03-22

Family

ID=74327858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011116845.9A Active CN112308757B (en) 2020-10-19 2020-10-19 Data display method and mobile terminal

Country Status (1)

Country Link
CN (1) CN112308757B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080058012A1 (en) * 2006-08-30 2008-03-06 Canon Kabushiki Kaisha Image processing apparatus, mobile terminal apparatus, image processing system and control method
US20080055336A1 (en) * 2006-08-31 2008-03-06 Canon Kabushiki Kaisha Image data management apparatus, image data management method, computer-readable storage medium
CN107564089A (en) * 2017-08-10 2018-01-09 腾讯科技(深圳)有限公司 Three dimensional image processing method, device, storage medium and computer equipment
WO2019076371A1 (en) * 2017-10-20 2019-04-25 维沃移动通信有限公司 Resource data display method and mobile terminal
CN110163942A (en) * 2018-07-18 2019-08-23 腾讯科技(深圳)有限公司 A kind of image processing method and device
CN111198610A (en) * 2018-11-16 2020-05-26 北京字节跳动网络技术有限公司 Method, device and equipment for controlling field of view of panoramic video and storage medium
CN111200750A (en) * 2018-11-16 2020-05-26 北京字节跳动网络技术有限公司 Multi-window playing method and device of panoramic video, electronic equipment and storage medium
CN111145352A (en) * 2019-12-20 2020-05-12 北京乐新创展科技有限公司 House live-action picture display method and device, terminal equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
吴军; 王玲容; 黄明益; 彭智勇: "High-precision single-image calibration of a fisheye camera under multiple geometric constraints" (多几何约束下的鱼眼相机单像高精度标定), Acta Optica Sinica (光学学报), no. 11, pages 199-210 *

Also Published As

Publication number Publication date
CN112308757B (en) 2024-03-22

Similar Documents

Publication Publication Date Title
WO2020207202A1 (en) Shadow rendering method and apparatus, computer device and storage medium
CN111833243B (en) Data display method, mobile terminal and storage medium
EP3832605B1 (en) Method and device for determining potentially visible set, apparatus, and storage medium
CN111813290B (en) Data processing method and device and electronic equipment
CN112017133B (en) Image display method and device and electronic equipment
EP3618006B1 (en) Image processing method and apparatus
CN109726368B (en) Map marking method and device
CN112150560B (en) Method, device and computer storage medium for determining vanishing point
CN108665510B (en) Rendering method and device of continuous shooting image, storage medium and terminal
CN112308766B (en) Image data display method and device, electronic equipment and storage medium
WO2019227485A1 (en) Augmented reality method for simulating wireless signal, and apparatus
CN112308768B (en) Data processing method, device, electronic equipment and storage medium
CN112308767B (en) Data display method and device, storage medium and electronic equipment
CN109842722B (en) Image processing method and terminal equipment
CN112308757B (en) Data display method and mobile terminal
CN112181230A (en) Data display method and device and electronic equipment
CN112306344B (en) Data processing method and mobile terminal
CN112184543B (en) Data display method and device for fisheye camera
CN112184801A (en) Data display method for fisheye camera and mobile terminal
CN113936096A (en) Customized rendering method and device of volume cloud and storage medium
CN115272604A (en) Stereoscopic image acquisition method and device, electronic equipment and storage medium
CN110941389A (en) Method and device for triggering AR information points by focus
JP7465976B2 (en) Collision range determination method, device, equipment, and computer program thereof
CN113673275B (en) Indoor scene layout estimation method and device, electronic equipment and storage medium
CN117348060A (en) Fault line determining method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant