Detailed Description
The present invention is described in detail below by way of specific examples so that those skilled in the art can easily practice the present invention in light of the present disclosure. The embodiments described below are only some, but not all, embodiments of the invention. All other embodiments, which can be made by a person skilled in the art without any inventive effort, based on the embodiments described in the present specification, fall within the scope of the present invention. In addition, embodiments and features of embodiments in the present specification may be combined with each other without conflict.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. The terms "first," "second," and the like, as used herein, are used merely to distinguish between different features, steps, operations, elements and/or components, and the like, and do not denote any particular technical meaning nor necessarily denote a logical order between them. The word "plurality" as used herein may refer to two or more than two, and the word "at least one" may refer to one, two or more than two. Any feature, step, operation, element, and/or component mentioned herein may be generally understood as one or more unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. The term "and/or" as used herein includes any or all combinations of one or more of the associated listed items. The element suffixes "module" and "unit" are used herein for convenience of description only, and thus, can be used interchangeably without any distinguishing meaning or function.
When prior art related to the present invention is well known to those skilled in the art, detailed descriptions thereof will be omitted. It should also be understood that the description of the embodiments in this specification focuses on differences between the embodiments; the same or similar features between embodiments may be referred to each other and, for brevity, are not repeated here.
As schematically shown in fig. 1, an exemplary system architecture 100 is illustrated in which an embodiment of a method for selecting a browsing point location in a virtual space based on an external device according to the present invention may be applied. The system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 provides communication between the terminal devices 101, 102, 103 and the server 105 and may include various connection types, such as wired links, wireless links, or fiber optic cables.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103. Various communication client applications, such as image and video capturing applications, text input applications, web browser applications, professional field application software, search class applications, instant messaging tools, mailbox clients, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
In a specific implementation, the terminal devices 101, 102, 103 may be implemented as hardware or as software according to actual needs. When the terminal devices 101, 102, 103 are implemented as hardware, they may be various electronic devices having (touch) display screens and supporting various inputs such as voice and text, including, but not limited to, personal computers (including notebook computers and desktop computers), tablet computers, smartphones, car terminals, electronic book readers, video players, etc. When the terminal devices 101, 102, 103 are implemented as software, they may be installed in a suitable electronic device and implemented as a plurality of software or software modules (e.g., for providing distributed services) or as a single software or software module. It should be understood that the terminal devices 101, 102, 103 shown in fig. 1 and described above are given by way of example only and should not be construed as particularly limiting.
The server 105 may be a server providing various services, for example, a background server providing processing of analysis, response, support, etc. of various information, such as control signals, voice, or text information, input to the terminal devices 101, 102, 103. The background server may analyze and process the received control signal, voice or target text, and the like, and feed back the processing result to the terminal devices 101, 102, 103 through the network 104.
In a specific implementation, the server 105 may be implemented as hardware or as software according to actual needs. When the server 105 is implemented as hardware, it may be implemented as a distributed server cluster composed of a plurality of servers or as a single server. When the server 105 is implemented as software, it may be implemented as a plurality of software or software modules (e.g., to provide distributed services) or as a single software or software module. It should be understood that the server 105 shown in fig. 1 and described above is given by way of example only and should not be construed as particularly limiting.
It should be noted that, the method for selecting the browsing point location in the virtual space based on the external device provided in the embodiment of the present application may be executed by the terminal devices 101, 102, 103, or may be executed by the server 105, or may be executed by the terminal devices 101, 102, 103 and the server 105 together in cooperation. Accordingly, the device for selecting the browsing point location in the virtual space based on the external device may be set in the terminal devices 101, 102, 103, or may be set in the server 105, or may be set in the terminal devices 101, 102, 103 and the server 105.
It can be appreciated that, when the method for selecting the browsing point location in the virtual space based on the external device provided in the embodiment of the present application is executed by the terminal devices 101, 102, 103, the system architecture 100 may not include the network 104 and the server 105.
It should be understood that the number and types of terminal devices, networks, and servers in fig. 1 are merely illustrative. In particular implementations, there may be any number and variety of terminal devices, networks, and servers, as desired.
As shown in fig. 2, a method for selecting a browsing point location in a virtual space based on an external device according to an embodiment of the present invention includes the following steps:
Step S201, collecting a control signal of an external device.
Step S202, confirming the current browsing point location according to the control signal of the external device collected in step S201.
Step S203, constructing a virtual camera.
Step S204, updating a transformation matrix of the virtual camera in a virtual space, namely a scene projection matrix (Projection Matrix), according to the control signal of the external device collected in step S201.
Step S205, updating the virtual scene with the current browsing point location as the virtual camera position, according to the updated transformation matrix of the virtual camera in the virtual space. The virtual scene can readily be generated and updated by a person skilled in the art using currently known tools, for example WebGL, so a detailed description is not repeated here.
Step S206, obtaining the browsing points preset in the virtual space. Similarly, these can readily be obtained by a person skilled in the art using known tools, so the description is not repeated here.
Step S207, constructing the view cone of the virtual camera constructed in step S203.
Step S208, on the basis of steps S205 and S206, initially selecting the browsing points located within the view cone constructed in step S207 as alternative browsing points.
Step S209, calculating the distance between each alternative browsing point initially selected in step S208 and the current browsing point.
Step S210, selecting the alternative browsing point closest to the current browsing point as the next browsing point, according to the result calculated in step S209.
In step S201 above, the control signal of the external device may be continuously collected by polling. The external device may be any of various external devices capable of inputting control signals, such as a mouse, a touch screen, a rotary rocker, a game handle, or a sensor capable of sensing human body limb movement or facial expression, which may be connected by wire through an interface such as USB (Universal Serial Bus) or HDMI (High-Definition Multimedia Interface), or wirelessly, for example via Bluetooth or infrared.
In step S202 above, a specific input of the external device, for example a specific button, may be polled. When the control signal value corresponding to that input is found to be valid, for example when it reaches a preset value, it is treated as a confirmation operation and the current browsing point location is confirmed.
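By way of illustration only, the polling and confirmation of steps S201 and S202 might be sketched as follows. The function names, the polling rate, and the confirmation threshold are illustrative assumptions and not part of the disclosure; the signal reader is injected so the sketch does not depend on any particular device API.

```javascript
// Sketch of steps S201-S202: poll a control signal of the external
// device at a fixed rate, and treat a "confirm" input reaching a
// preset value as selection of the current browsing point.
const POLL_RATE_FPS = 60;     // illustrative polling frequency N
const CONFIRM_THRESHOLD = 1;  // illustrative preset value for confirmation

// Returns true when the sampled signal value counts as a confirmation.
function isConfirmed(signalValue) {
  return signalValue >= CONFIRM_THRESHOLD;
}

// Poll `readConfirmSignal` until it confirms, then pass the current
// browsing point to the callback. Injecting the reader keeps the
// sketch testable without real external-device interfaces.
function pollForConfirmation(readConfirmSignal, onConfirm, currentPoint) {
  const timer = setInterval(() => {
    if (isConfirmed(readConfirmSignal())) {
      clearInterval(timer);
      onConfirm(currentPoint);
    }
  }, 1000 / POLL_RATE_FPS);
  return timer;
}
```

In a browser deployment the injected reader could, for example, sample a gamepad button value, while in other environments it could wrap any driver callback.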
In the step S203, the virtual camera may be constructed by WebGL, and those skilled in the art will be able to construct the virtual camera by WebGL, which is a conventional and common operation, so this description will not be repeated here.
Referring to fig. 3, a specific flowchart of updating the transformation matrix of the virtual camera in the virtual space according to the control signal of the external device in step S204 is shown. As shown in fig. 3, in step S301, it is determined whether the control signal of the external device collected by polling in step S201 is zero; if the control signal is zero, the next poll is simply awaited. If polling finds that one or more control signals of the external device are not zero, an encapsulated library is used in step S302 to convert the non-zero control signal(s) into a parameter of the virtual camera. For example, the product of the interval between two polls and the value of the control signal is calculated and taken as the rotation angle value of the virtual camera. In the case where the external device is a rocker, the rocker may be rotatable about its own axis, with a corresponding signal output port, so that the port outputs a signal whose intensity lies in the range [0, 1] according to the rotation angle of the rocker. If the port is polled at a frequency of N polls per second, the product of the signal output intensity and the time interval 1/N is obtained in each interval and, following the concept of the integral in calculus, the accumulated rotation angle (rotation angle value) at any moment can be obtained by summing these products. Then, in step S303, the transformation matrix of the virtual camera in the virtual space is updated according to the rotation angle value of the virtual camera. The virtual camera can accordingly update the clipped picture of the virtual space, thereby allowing the user to control which picture of the virtual space is seen.
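The discrete integration of steps S302 and S303 can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the names, the mapping from signal intensity to angular speed, and the choice of a Y-axis rotation are all illustrative assumptions.

```javascript
// Sketch of steps S302-S303: accumulate signal_intensity * (1/N) on
// each poll (the discrete integral described above) to obtain the
// rotation angle value, then build a transformation matrix from it.
const N = 60;      // polling frequency in polls per second
const DT = 1 / N;  // time interval between two polls

let accumulatedAngle = 0; // rotation angle value, in radians

// Called once per poll with the signal intensity in [0, 1].
// angularSpeed maps full intensity to radians per second (assumed).
function accumulateRotation(signalIntensity, angularSpeed) {
  accumulatedAngle += signalIntensity * angularSpeed * DT;
  return accumulatedAngle;
}

// A minimal column-major 4x4 rotation matrix about the Y axis, the
// kind of matrix a WebGL virtual-camera update would consume.
function rotationYMatrix(angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [
    c, 0, -s, 0,
    0, 1,  0, 0,
    s, 0,  c, 0,
    0, 0,  0, 1,
  ];
}
```

For example, a constant signal intensity of 0.5 polled for one second yields an accumulated angle of half the assumed angular speed, matching the integral interpretation in the text.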
The steps S207 to S210 will be described in detail below with reference to fig. 4 to 6. Wherein fig. 4 shows a perspective view of a virtual camera view cone, fig. 5 shows a view cone cross-section taken along a plane formed by the dashed lines shown in fig. 4, and fig. 6 shows a plurality of browsing points located in a virtual space and a schematic view of the position relationship with the virtual camera view cone shown in fig. 4.
The construction of the view cone of the virtual camera in step S207 is a routine operation for those skilled in the art, so the specific construction method and process are not repeated in the present specification. Fig. 4 and fig. 5 show schematic views of a virtual camera view cone, wherein the virtual camera is located at position O, the near clipping plane (Near plane, also referred to as "near clipping surface", etc.) of the view cone is represented by the rectangle ABCD, the far clipping plane (Far plane, also referred to as "far clipping surface", etc.) is represented by the rectangle A'B'C'D', and the field angle (Field of View, Fov) is denoted Fov in fig. 5. The dash-dotted arrow OP in fig. 4 indicates the line-of-sight direction of the virtual camera, which originates from the virtual camera position O and passes through the center points of the near clipping plane ABCD and the far clipping plane A'B'C'D'. The dashed line MN in fig. 5 represents a horizontal plane, and the triangle EOF in fig. 5 corresponds to the vertical plane EOF passing through position O, represented by the dashed lines in fig. 4, which is perpendicular to the horizontal plane.
As shown in fig. 6, for clarity, six browsing points R1, R2, R3, R4, R5, R6 in the virtual space are schematically shown; it is easily understood that any number of browsing points may be preset in the virtual space in practice. In step S208, the browsing points located within the virtual camera view cone are initially selected as alternative browsing points. Specifically, the browsing points R3 and R6 in fig. 6 do not lie between the near clipping plane and the far clipping plane, and are therefore eliminated and not used as alternative browsing points. The present specification refers to the vector from the virtual camera position O to a browsing point as the browsing point vector, so that each browsing point R1, R2, R3, R4, R5, R6 has its corresponding browsing point vector (only the browsing point vector OR1 of browsing point R1 is shown, by way of example, with a two-dot chain arrow in the figure), and each browsing point vector forms an angle with the virtual camera line-of-sight direction OP. As can be seen from fig. 6, the angle between the browsing point vector OR1 of browsing point R1 and the line-of-sight direction OP is greater than one half of the field angle Fov, so browsing point R1 is likewise removed and not used as an alternative browsing point. In this way, the browsing points R1, R3, R6 are eliminated, and the remaining three browsing points R2, R4, R5 are taken as alternative browsing points.
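The initial selection of steps S207 and S208 as described above (depth between the near and far clipping planes, and angle between the browsing point vector and OP at most half the field angle Fov) can be sketched as follows. The function and helper names are illustrative; points and directions are assumed to be 3-element arrays.

```javascript
// Sketch of steps S207-S208: initially select as alternative browsing
// points those points whose depth along the line-of-sight direction OP
// lies between the near and far clipping planes and whose browsing
// point vector makes an angle of at most Fov/2 with OP.
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function norm(a)   { return Math.sqrt(dot(a, a)); }

function selectCandidates(points, cameraO, viewDirOP, near, far, fovRad) {
  const d = norm(viewDirOP);
  return points.filter((p) => {
    const v = sub(p, cameraO);            // browsing point vector
    const along = dot(v, viewDirOP) / d;  // depth along OP
    if (along < near || along > far) return false; // cf. R3, R6
    const angle = Math.acos(dot(v, viewDirOP) / (norm(v) * d));
    return angle <= fovRad / 2;           // cf. R1 being rejected
  });
}
```

Note this angle test describes a cone about OP, matching the half-Fov criterion of the text; a rectangular-frustum test would instead check the horizontal and vertical half-angles separately.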
In step S209, the distances between the alternative browsing points R2, R4, R5 initially selected in step S208 and the current browsing point confirmed in step S202 (i.e., the position O of the virtual camera) are calculated. Subsequently, in step S210, the alternative browsing point R4 closest to the current browsing point is selected as the next browsing point.
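Steps S209 and S210 reduce to a nearest-neighbour selection over the alternative browsing points, which might be sketched as follows (names illustrative, points again as 3-element arrays):

```javascript
// Sketch of steps S209-S210: compute the Euclidean distance from each
// alternative browsing point to the current browsing point (the camera
// position O) and select the nearest one as the next browsing point.
function distance(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

function nearestPoint(candidates, currentPoint) {
  let best = null, bestDist = Infinity;
  for (const p of candidates) {
    const d = distance(p, currentPoint);
    if (d < bestDist) { bestDist = d; best = p; }
  }
  return best; // null when no alternative browsing point survives
}
```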
According to a preferred embodiment of the present invention, in order to further screen the alternative browsing points selected in step S208, browsing points whose browsing point vector forms an angle with the virtual camera line-of-sight direction OP greater than a specific value, for example 45 degrees, 30 degrees, or 15 degrees, may additionally be rejected and not used as alternative browsing points, so that a more preferred set of alternative browsing points is obtained. Taking the embodiment described above in connection with fig. 6 as an example, the browsing point R2, whose browsing point vector forms a larger angle with the line-of-sight direction OP, may be further removed, leaving browsing points R4 and R5 as alternative browsing points and thereby reducing the amount of calculation in step S209.
It should be noted that, in the drawings and the above description, the steps are described in the above sequence for the convenience of describing the present invention, but the method for selecting the browsing point location in the virtual space based on the external device of the present invention is not limited to the above sequence of steps. Specifically, for example, the steps S201, S203, and S206 and the steps performed subsequently thereto may be performed simultaneously without any particular order, or may be performed in random order. It should also be understood that the above-described method steps of the present invention are merely for the purpose of clearly illustrating the present invention, and thus are not intended to limit the present invention in any way. In other words, the above described method steps may be combined or further subdivided without essentially departing from the inventive concept and also fall within the scope of the claimed application. Specifically, for example, steps S203 and S204 may be combined and expressed as one step, and steps S207 and S208 may be combined and expressed as one step.
Fig. 7 shows a schematic block diagram of an apparatus 700 for selecting a browsing point location in a virtual space based on an external device according to an embodiment of the present invention; the apparatus 700 is particularly applicable to various electronic devices. As shown in fig. 7, the apparatus 700 includes: a signal acquisition module 701, which collects control signals of the external device, preferably continuously by polling; a current browsing point location confirmation module 702, which confirms the current browsing point location according to the control signal of the external device, preferably when the signal acquisition module 701 polls a control signal value of a specific input of the external device that reaches a preset value; a virtual camera module 703, which constructs a virtual camera and updates the virtual scene according to the control signal of the external device; a browsing point acquisition module 704, which obtains all preset browsing points in the virtual space; an alternative browsing point initial selection module 705, which constructs the view cone of the virtual camera and initially selects the browsing points within the view cone as alternative browsing points; a distance calculation module 706, which calculates the distance between each alternative browsing point and the current browsing point; and a selection module 707, which selects the alternative browsing point closest to the current browsing point as the next browsing point.
As shown in fig. 8, the virtual camera module 703 may further include a transformation matrix updating sub-module 7031 and a virtual scene updating sub-module 7032. The transformation matrix updating sub-module 7031 updates the transformation matrix of the virtual camera in the virtual space according to the control signal of the external device; specifically, when the signal acquisition module 701 polls one or more control signals of the external device that are not zero, the product of the polling interval duration and the value of the control signal is calculated and taken as the rotation angle value of the virtual camera, and the transformation matrix of the virtual camera in the virtual space is updated according to this rotation angle value. The virtual scene updating sub-module 7032 updates the virtual scene according to the updated transformation matrix of the virtual camera. The virtual camera module 703 may construct the virtual camera and update the virtual scene through WebGL.
Referring to FIG. 9, there is shown a schematic diagram of a computer system that may be used to implement the apparatus of an embodiment of the present invention. It should be noted that the apparatus shown in fig. 9 is only an example, and should not be construed as having any limiting effect on the embodiments of the present application. The computer system shown in fig. 9 includes a Central Processing Unit (CPU) 901, which can execute various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage unit 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data required for the operation of the computer system are also stored. The CPU 901, ROM 902, and RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
The following components are connected to the I/O interface 905: an input unit 906 including a keyboard, a mouse, a microphone, a touch screen, and the like; an output unit 907 including a display screen such as a liquid crystal display or a light-emitting diode display, a speaker, and the like; a storage unit 908 including a hard disk memory or the like; and a communication unit 909 including a network interface card such as a WAN/LAN card, a modem, or the like. The communication unit 909 performs communication processing via a network such as the Internet or a local area network. A drive 910 may also be connected to the I/O interface 905 as needed. A removable medium 911, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, may be mounted on the drive 910 as needed, so that a computer program read therefrom can be installed into the storage unit 908 as needed.
In another aspect, the present application also provides a computer readable medium, which may be contained in the apparatus described in the above embodiments; or may be present alone without being assembled into the device. The computer readable medium carries one or more programs, which when executed by the apparatus, cause the apparatus to implement the steps of the method for selecting a browsing point location in a virtual space based on an external device according to the present invention.
In still another aspect, the present application further provides a computer program product, which includes computer instructions that, when executed by a processor, implement the steps of the method for selecting a browsing point location in a virtual space based on an external device according to the present invention.
In particular, the embodiments described above with reference to the flowcharts in the figures may be implemented as computer software programs. For example, embodiments disclosed herein include a computer program product comprising program instructions or code for performing the method of selecting a point of view in virtual space based on an external device of the present invention as shown in the flowcharts of the figures. In such an embodiment, the computer program may be downloaded and installed from a network via the communication unit 909 and/or installed from the removable medium 911. The method of the present invention is performed when the computer program is executed by a Central Processing Unit (CPU) 901.
It should be noted that the computer readable medium described in the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: a computer magnetic disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules involved in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The above units or modules may also be provided in a processor, for example, described as: the processor comprises a signal acquisition module, a current browsing point position confirmation module, a virtual camera module, a virtual scene updating module, a browsing point position acquisition module, an alternative browsing point position primary selection module, a distance calculation module and a selection module. The names of these units or modules do not in some cases constitute a limitation of the unit or module itself, e.g. a signal acquisition module may also be described as a "module for acquiring signals".
All documents mentioned in this specification are incorporated by reference in this application as if each were fully incorporated by reference into this specification.
Further, it is understood that various changes or modifications of the present invention may be made by those skilled in the art after reading the above description of the invention, and such equivalents are intended to fall within the scope of the invention.