CN109284456B - Space positioning method of webpage control in AR and AR system - Google Patents

Publication number: CN109284456B
Authority: CN (China)
Prior art keywords: coordinates, browser, control, screen
Legal status: Active
Application number: CN201811000513.7A
Other languages: Chinese (zh)
Other versions: CN109284456A
Inventor: 赵熙
Current Assignee: Guangdong Yuewei Information Technology Co ltd
Original Assignee: Guangdong Yuewei Information Technology Co ltd
Application CN201811000513.7A filed by Guangdong Yuewei Information Technology Co ltd; published as CN109284456A, granted and published as CN109284456B.


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the field of augmented reality, and in particular to a method for spatially positioning a web page control in AR (augmented reality), comprising the following steps: acquiring the visible-window coordinates of the control in a PC browser from the control's page coordinates on the web page; calculating the control's browser coordinates from the visible-window coordinates; calculating the control's screen coordinates on the PC display screen from the browser coordinates and the origin of the browser coordinate system; calculating the control's real-world coordinates from the screen coordinates; and, using a preset three-dimensional registration algorithm, calculating from the real-world coordinates the coordinates of the control's corresponding projected point on the camera projection plane of the AR device, to complete the spatial positioning. The invention also provides an AR system. By acquiring and transforming the coordinate information of the web page control across several coordinate systems, the invention ensures accurate spatial positioning of the web page control in AR.

Description

Space positioning method of webpage control in AR and AR system
Technical Field
The invention relates to the field of augmented reality, in particular to a method for positioning a webpage control in an AR space and an AR system.
Background
Augmented Reality (AR) is a technology that calculates the position and angle of the camera image in real time and adds corresponding images, videos, and 3D models, with the aim of overlaying the virtual world on the real world on a screen and enabling interaction with it.
With the development of AR technology, interaction with the internet increasingly takes place through AR. In web browsing, a web page usually contains many controls, and even for an ordinary user, accurately filling in every text input box of a form-heavy page has a certain difficulty. In particular, a user may now browse a web page using AR glasses, and when the user tries to fill in a text input box of the web page in the AR display interface, filling is often difficult because of spatial positioning errors.
Disclosure of Invention
Embodiments of the present invention are directed to solving at least one of the technical problems in the prior art. To this end, embodiments of the invention provide a method for spatially positioning a web page control in AR, and an AR system.
The method for spatially positioning a web page control in AR comprises the following steps:
step 1, acquiring the visible-window coordinates of a control in a PC browser from the control's page coordinates on the web page; the visible-window coordinates take the upper left corner of the visible window as the origin of coordinates;
step 2, calculating the browser coordinates of the control from the visible-window coordinates; the browser coordinates take the upper left corner of the browser as the origin of coordinates;
step 3, calculating the screen coordinates of the control on the PC display screen from the browser coordinates and the origin of the browser coordinate system;
step 4, calculating the real-world coordinates of the control from the screen coordinates;
step 5, using a preset three-dimensional registration algorithm, calculating from the real-world coordinates the coordinates of the control's corresponding projected point on the camera projection plane of the AR device, to complete the spatial positioning.
In one embodiment, step 1 comprises:
step 11, determining whether no scroll bar in the current visible window has been scrolled;
step 12, if the result of step 11 is yes, the visible-window coordinates of the control are (X_visible, Y_visible) = (X, Y); otherwise (X_visible, Y_visible) = (X - b, Y - a);
when X_visible < 0, Y_visible < 0, X_visible > M, or Y_visible > N, the control is in an invisible state in the visible window;
wherein (X, Y) are the page coordinates of the control, a is the distance the right (vertical) scroll bar has scrolled from the top, b is the distance the bottom (horizontal) scroll bar has scrolled from the left, M is the maximum X-axis coordinate of the visible window, and N is the maximum Y-axis coordinate of the visible window.
In one embodiment, step 2 comprises:
step 21, determining whether the browser is in a full-screen display state;
step 22, if the result of step 21 is yes, the browser coordinates of the control are (X_browser, Y_browser) = (X_visible, Y_visible + h_full); otherwise (X_browser, Y_browser) = (X_visible, Y_visible + h_notfull);
wherein h_full is the height offset between the browser coordinate system and the visible window when the browser is displayed full screen, and h_notfull is that offset when the browser is not displayed full screen.
In one embodiment, step 3 comprises: when the browser is determined to be in a full-screen display state, the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser); when the browser is determined to be in a non-full-screen display state, (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O);
wherein (X_O, Y_O) are the coordinates of the origin of the browser coordinate system in the screen coordinate system.
In one embodiment, (X_display, Y_display, 0) represents the coordinates of a virtual object in the real world and (C_x, C_y) represents the coordinates of the control's corresponding projected point on the camera projection plane of the AR device; step 5 comprises:
calculating (C_x, C_y) by the following formula to complete the spatial positioning:

Z_c · (C_x, C_y, 1)^T = K · ( R · (X_display, Y_display, 0)^T + T )

wherein

K = [ f/S_x    0      u_0 ]
    [ 0      f/S_y    v_0 ]
    [ 0        0       1  ]

is the coefficient matrix of the camera's internal parameters, f is the focal length of the camera, S_x and S_y are scale factors, and (u_0, v_0) are the coordinates of the center of the acquired image; and

[ R    T ]
[ 0^T  1 ]

is the transformation matrix from the real world coordinate system to the camera coordinate system, with rotation R and translation T.
The invention also provides an AR system, which comprises a PC end and an AR end which are connected with each other, and is characterized in that the PC end comprises a first calculation module, a second calculation module, a third calculation module and a fourth calculation module, and the AR end comprises an AR calculation module;
the first calculation module is used for acquiring the visible window coordinates of the control in the PC browser according to the page coordinates of the control on the webpage; the visual window coordinate takes the upper left corner of the visual window as the origin of coordinates;
the second calculation module is used for calculating browser coordinates of the control in the browser according to the visible window coordinates; the browser coordinates take the upper left corner of the browser as the origin of coordinates;
the third calculation module is used for calculating and obtaining the screen coordinates of the control on the PC display screen according to the browser coordinates and the origin of the browser coordinates;
the fourth calculation module is used for calculating and obtaining real world coordinates of the control in the real world according to the screen coordinates;
and the AR calculation module is used for calculating and obtaining the coordinates of the projection corresponding points of the control on the camera projection plane of the AR equipment by utilizing a preset three-dimensional registration algorithm according to the real world coordinates so as to complete space positioning.
In one embodiment, the first calculation module comprises:
a first judgment unit, used for determining whether no scroll bar in the current visible window has been scrolled;
a first calculation unit, used for determining, if the result of the first judgment unit is yes, the visible-window coordinates of the control as (X_visible, Y_visible) = (X, Y); otherwise (X_visible, Y_visible) = (X - b, Y - a);
when X_visible < 0, Y_visible < 0, X_visible > M, or Y_visible > N, the control is in an invisible state in the visible window;
wherein (X, Y) are the page coordinates of the control, a is the distance the right (vertical) scroll bar has scrolled from the top, b is the distance the bottom (horizontal) scroll bar has scrolled from the left, M is the maximum X-axis coordinate of the visible window, and N is the maximum Y-axis coordinate of the visible window.
In one embodiment, the second calculation module comprises:
a second judgment unit, used for determining whether the browser is in a full-screen display state;
a second calculation unit, used for determining, if the result of the second judgment unit is yes, the browser coordinates of the control as (X_browser, Y_browser) = (X_visible, Y_visible + h_full); otherwise (X_browser, Y_browser) = (X_visible, Y_visible + h_notfull);
wherein h_full is the height offset between the browser coordinate system and the visible window when the browser is displayed full screen, and h_notfull is that offset when the browser is not displayed full screen.
In one embodiment, the third calculation module is specifically configured to determine, when the browser is in a full-screen display state, the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser); and when the browser is in a non-full-screen display state, (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O); wherein (X_O, Y_O) are the coordinates of the origin of the browser coordinate system in the screen coordinate system.
In one embodiment, (X_display, Y_display, 0) represents the coordinates of a virtual object in the real world and (C_x, C_y) represents the coordinates of the control's corresponding projected point on the camera projection plane of the AR device; the AR calculation module calculates (C_x, C_y) by the following formula to complete the spatial positioning:

Z_c · (C_x, C_y, 1)^T = K · ( R · (X_display, Y_display, 0)^T + T )

wherein

K = [ f/S_x    0      u_0 ]
    [ 0      f/S_y    v_0 ]
    [ 0        0       1  ]

is the coefficient matrix of the camera's internal parameters, f is the focal length of the camera, S_x and S_y are scale factors, and (u_0, v_0) are the coordinates of the center of the acquired image; and

[ R    T ]
[ 0^T  1 ]

is the transformation matrix from the real world coordinate system to the camera coordinate system.
The spatial positioning method for a web page control in AR and the AR system described above ensure accurate spatial positioning of the web page control in AR by acquiring and transforming the control's coordinate information across several coordinate systems.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of embodiments of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of a coordinate system of an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for spatially positioning a web page control in an AR according to an embodiment of the present invention;
fig. 3 is a schematic diagram of the composition of the AR system according to the embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are exemplary only and should not be construed as limiting the embodiments of the present invention.
As shown in FIG. 1, the invention takes the upper left corner of the display screen as the origin, with the positive X direction extending to the right, the positive Y direction extending downward, and the positive Z direction pointing into the display.
The establishment of the real world coordinate system xyz is described next. As shown in FIG. 1, the uvw coordinate system is the coordinate system of the virtual world in AR, the x'y'z' coordinate system is the camera coordinate system, the V_x V_y coordinate system is the coordinate system of the camera projection plane, and the display plane lies in the xoy plane of the real world coordinate system, where z = 0.
Referring to fig. 2, a method for spatially positioning a web page control in AR according to an embodiment of the present invention includes:
Step 1, acquiring the visible-window coordinates of a control in the PC browser from the control's page coordinates on the web page; the visible-window coordinates take the upper left corner of the visible window as the origin of coordinates.
Step 2, calculating the browser coordinates of the control from the visible-window coordinates; the browser coordinates take the upper left corner of the browser as the origin of coordinates.
Step 3, calculating the screen coordinates of the control on the PC display screen from the browser coordinates and the origin of the browser coordinate system.
Step 4, calculating the real-world coordinates of the control from the screen coordinates.
Step 5, using a preset three-dimensional registration algorithm, calculating from the real-world coordinates the coordinates of the control's corresponding projected point on the camera projection plane of the AR device, to complete the spatial positioning.
As shown in fig. 3, the present invention further provides an AR system, which includes a PC end and an AR end connected to each other, where the PC end includes a first calculation module, a second calculation module, a third calculation module, and a fourth calculation module, and the AR end includes an AR calculation module. The introduction of each module is as follows:
the first calculation module is used for acquiring the visible window coordinates of the control in the PC browser according to the page coordinates of the control on the webpage; the visual window coordinates take the upper left corner of the visual window as the origin of coordinates.
The second calculation module is used for calculating browser coordinates of the control in the browser according to the visible window coordinates; the browser coordinates have the top left corner of the browser as the origin of coordinates.
The third calculation module is used for calculating the screen coordinates of the control on the PC display screen from the browser coordinates and the origin of the browser coordinate system.
The fourth calculation module is used for calculating the real-world coordinates of the control from the screen coordinates.
The AR calculation module is used for calculating, from the real-world coordinates and using a preset three-dimensional registration algorithm, the coordinates of the control's corresponding projected point on the camera projection plane of the AR device, to complete the spatial positioning.
In this embodiment, the steps of the method for spatially positioning the web page control in AR are executed by the AR system as a whole, or by the individual modules of the system. Specifically, step 1 is executed by the first calculation module, step 2 by the second calculation module, step 3 by the third calculation module, step 4 by the fourth calculation module, and step 5 by the AR calculation module.
In step 1, for example, the page coordinates of the control are (X, Y), the distance the vertical (right) scroll bar has scrolled from the top is a, and the distance the horizontal (bottom) scroll bar has scrolled from the left is b. The size of the visible window is M × N. The first calculation module may obtain the page coordinates of the control on the web page as follows (using jQuery, selecting the control by its id):
X = $("#controlId").offset().left
Y = $("#controlId").offset().top
a = $(document).scrollTop()
b = $(document).scrollLeft()
Because a web page may have scroll bars, whether or not the scroll bars have been scrolled affects the control's visible-window coordinates in the PC browser.
Specifically, step 1 comprises:
Step 11, determining whether no scroll bar in the current visible window has been scrolled.
Step 12, if the result of step 11 is yes, the visible-window coordinates of the control are (X_visible, Y_visible) = (X, Y); otherwise (X_visible, Y_visible) = (X - b, Y - a).
When X_visible < 0, Y_visible < 0, X_visible > M, or Y_visible > N, the control is in an invisible state in the visible window. M is the maximum X-axis coordinate of the visible window, and N is the maximum Y-axis coordinate of the visible window.
Accordingly, the first calculation module in the AR system includes:
The first judgment unit is used for determining whether no scroll bar in the current visible window has been scrolled.
The first calculation unit is used for determining, if the result of the first judgment unit is yes, the visible-window coordinates of the control as (X_visible, Y_visible) = (X, Y); otherwise (X_visible, Y_visible) = (X - b, Y - a).
When X_visible < 0, Y_visible < 0, X_visible > M, or Y_visible > N, the control is in an invisible state in the visible window.
Wherein (X, Y) are the page coordinates of the control, a is the distance the right (vertical) scroll bar has scrolled from the top, b is the distance the bottom (horizontal) scroll bar has scrolled from the left, M is the maximum X-axis coordinate of the visible window, and N is the maximum Y-axis coordinate of the visible window.
Step 11 may be performed by the first judgment unit, and step 12 may be performed by the first calculation unit. The control's visible-window coordinates are thus computed by different procedures depending on whether any scroll bar in the current visible window has been scrolled.
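The branch in steps 11 and 12 can be sketched as a pair of small helpers. This is a minimal illustration, not code from the patent; the function names and the explicit scroll-offset parameters are assumptions:

```javascript
// Sketch of steps 11-12: map a control's page coordinates (X, Y) to
// visible-window coordinates, given how far the scroll bars have moved.
// a = vertical scroll distance from the top, b = horizontal distance from the left.
function toVisibleWindowCoords(pageX, pageY, a, b) {
  // If no scroll bar has been scrolled, a and b are both 0 and the page
  // coordinates are already the visible-window coordinates (the "yes" branch).
  return { x: pageX - b, y: pageY - a };
}

// A control is invisible when its visible-window coordinates fall outside
// the M x N visible window (M, N as defined in step 1).
function isVisibleInWindow(coords, M, N) {
  return coords.x >= 0 && coords.y >= 0 && coords.x <= M && coords.y <= N;
}
```

When a = b = 0 the helper reduces to the step 12 "yes" branch, (X_visible, Y_visible) = (X, Y).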
In step 2, the browser coordinates take the upper left corner of the browser as the origin of coordinates, so whether or not the browser is in a full-screen display state directly affects the browser coordinates of the control. Specifically, step 2 comprises:
Step 21, determining whether the browser is in a full-screen display state.
Step 22, if the result of step 21 is yes, the browser coordinates of the control are (X_browser, Y_browser) = (X_visible, Y_visible + h_full); otherwise (X_browser, Y_browser) = (X_visible, Y_visible + h_notfull).
Wherein h_full is the height offset between the browser coordinate system and the visible window when the browser is displayed full screen, and h_notfull is that offset when the browser is not displayed full screen.
Accordingly, the second calculation module in the AR system includes:
The second judgment unit is used for determining whether the browser is in a full-screen display state.
The second calculation unit is used for determining, if the result of the second judgment unit is yes, the browser coordinates of the control as (X_browser, Y_browser) = (X_visible, Y_visible + h_full); otherwise (X_browser, Y_browser) = (X_visible, Y_visible + h_notfull).
Wherein h_full is the height offset between the browser coordinate system and the visible window when the browser is displayed full screen, and h_notfull is that offset when the browser is not displayed full screen.
This height can be calculated as:
h = window.outerHeight - window.innerHeight
that is, the JavaScript window.outerHeight property returns the outer height of the browser window, and window.innerHeight returns the height of the window's content display area; their difference gives the height offset h used above.
Step 21 may be performed by the second judgment unit, and step 22 may be performed by the second calculation unit. The control's browser coordinates are thus computed by different procedures depending on whether the browser is in a full-screen display state.
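Steps 21 and 22 can be sketched the same way; here hFull and hNotFull stand for the h_full and h_notfull offsets above (in a browser, window.outerHeight - window.innerHeight evaluated in the corresponding state), and the function name is illustrative:

```javascript
// Sketch of steps 21-22: lift visible-window coordinates into browser
// coordinates. hFull / hNotFull are the height offsets between the browser
// coordinate system and the visible window in the full-screen and
// non-full-screen states.
function toBrowserCoords(xVisible, yVisible, isFullScreen, hFull, hNotFull) {
  const h = isFullScreen ? hFull : hNotFull;
  // Only the Y coordinate shifts: the offset sits above the viewport.
  return { x: xVisible, y: yVisible + h };
}
```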
Step 3 specifically comprises: when the browser is determined to be in a full-screen display state, the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser); when the browser is determined to be in a non-full-screen display state, (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O), where (X_O, Y_O) are the coordinates of the origin of the browser coordinate system in the screen coordinate system.
Correspondingly, the third calculation module is specifically configured to perform this same computation.
The origin of the screen coordinate system is at the upper left corner of the screen; when the browser is full screen, the browser coordinate system coincides with the screen coordinate system, so the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser).
When the browser is not full screen, let the origin of the browser coordinate system be at (X_O, Y_O) in the screen coordinate system; the control's screen coordinates are then (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O).
Wherein XO,YOThe method comprises the following steps:
XO=window.screenLeft
YO=window.screenTop
the X and Y coordinates of the window relative to the screen are returned using the screen left and screen top attributes in JavaScript.
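A hypothetical sketch of step 3, with (xO, yO) standing for the browser origin in screen coordinates (window.screenLeft, window.screenTop); the function name is illustrative:

```javascript
// Sketch of step 3: browser coordinates -> screen coordinates. When the
// browser is full screen the two coordinate systems coincide; otherwise the
// browser origin sits at (xO, yO) in the screen coordinate system.
function toScreenCoords(xBrowser, yBrowser, isFullScreen, xO, yO) {
  if (isFullScreen) {
    return { x: xBrowser, y: yBrowser };
  }
  return { x: xBrowser + xO, y: yBrowser + yO };
}
```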
In step 4, a specific example of calculating the control's real-world coordinates from its screen coordinates is given. Taking a 23.6-inch (16:9) display as an example, the screen diagonal length DL = 23.6 × 2.54 = 59.944 cm.
By the Pythagorean theorem:
the display screen length DW is 52.246 cm;
the display screen width DH is 29.388 cm.
The screen resolution is SW × SH, where SW and SH are obtained by:
SW = screen.width
SH = screen.height
Let the pixel coordinates of the control on the screen be (X_screen, Y_screen). Since the control lies on the display plane, which is the xoy plane of the real world coordinate system (z = 0), the control's coordinates on the display screen are (X_display, Y_display, 0), in centimeters, where

X_display = X_screen × DW / SW
Y_display = Y_screen × DH / SH
The above is the example of a 23.6-inch display. In general, for a d-inch display with aspect ratio m:n, the screen diagonal length DL = 2.54 d (cm).
By the Pythagorean theorem:
① DW² + DH² = DL²
② DW : DH = m : n
③ DL = 2.54 d
From ①–③:

DW = 2.54 d · m / √(m² + n²)
DH = 2.54 d · n / √(m² + n²)

The coordinates of the control in the real world coordinate system are (X_display, Y_display, 0), in centimeters, where

X_display = X_screen × DW / SW
Y_display = Y_screen × DH / SH
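The general d-inch formulas above can be sketched as follows; the function names are illustrative, not from the patent:

```javascript
// Sketch of step 4: convert screen pixel coordinates to real-world
// centimetres for a d-inch display with aspect ratio m:n and resolution
// SW x SH, following formulas (1)-(3) above.
function displaySizeCm(d, m, n) {
  const DL = 2.54 * d; // diagonal length in cm
  const diag = Math.sqrt(m * m + n * n);
  return { DW: (DL * m) / diag, DH: (DL * n) / diag };
}

function toRealWorldCoords(xScreen, yScreen, d, m, n, SW, SH) {
  const { DW, DH } = displaySizeCm(d, m, n);
  // The display plane is the xoy plane of the real world coordinate system,
  // so z is always 0.
  return { x: (xScreen * DW) / SW, y: (yScreen * DH) / SH, z: 0 };
}
```

For the 23.6-inch 16:9 example, displaySizeCm(23.6, 16, 9) gives DW ≈ 52.246 cm and DH ≈ 29.388 cm, matching the values above.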
In step 5, (X_display, Y_display, 0) represents the coordinates of a virtual object in the real world and (C_x, C_y) represents the coordinates of the control's corresponding projected point on the camera projection plane of the AR device. The AR calculation module calculates (C_x, C_y) by the following formula to complete the spatial positioning:

Z_c · (C_x, C_y, 1)^T = K · ( R · (X_display, Y_display, 0)^T + T )

wherein

K = [ f/S_x    0      u_0 ]
    [ 0      f/S_y    v_0 ]
    [ 0        0       1  ]

is the coefficient matrix of the camera's internal parameters, f is the focal length of the camera, S_x and S_y are scale factors, and (u_0, v_0) are the center coordinates of the feature image. The camera's entire internal-parameter coefficient matrix can be derived by a calibration algorithm, for example Zhang Zhengyou's calibration method.

[ R    T ]
[ 0^T  1 ]

is the transformation matrix from the real world coordinate system to the camera coordinate system.
The coordinates of the control in the real world coordinate system were obtained in step 4, so a virtual object can be accurately superimposed on the control through the three-dimensional registration algorithm; (X_display, Y_display, 0) represents the coordinates of this virtual object in the real world. The projection of a three-dimensional virtual object onto the projection plane generally uses a perspective projection transformation. Taking a three-dimensional registration algorithm based on a feature image as an example, the virtual object's real-world coordinates (X_display, Y_display, 0) and the corresponding projected point (C_x, C_y) on the camera projection plane satisfy the coordinate conversion relationship above.
In particular,

[ R    T_w ]
[ 0^T   1  ]

is the transformation matrix from the real world coordinate system to the camera coordinate system, where

R = [ r11  r12  r13 ]
    [ r21  r22  r23 ]
    [ r31  r32  r33 ]

is the rotation matrix and T_w is the translation matrix. Since the camera coordinate system is attached to the camera of the AR glasses, it changes constantly while the user wears the glasses, so the R and T matrices must be computed in real time for real-time tracking registration. Here r11, r12, r13, r21, r22, r23, r31, r32, r33 are the elements of the rotation matrix R.
The matrix T_w contains the translation component: assuming the origin of the camera coordinate system moves a distance a along the x-axis, a distance b along the y-axis, and a distance c along the z-axis, then

T_w = (a, b, c)^T
Assume the initial rotation matrix at the origin of the camera coordinate system is the identity matrix

R_0 = [ 1  0  0 ]
      [ 0  1  0 ]
      [ 0  0  1 ]

When the origin of the camera coordinate system is rotated by an angle θ around the x-axis of the real world coordinate system,

R_x = [ 1    0      0    ]
      [ 0   cosθ  -sinθ  ]
      [ 0   sinθ   cosθ  ]

that is, R_x represents a rotation by θ about the x-axis only, with no rotation about the y-axis or z-axis.
When the origin of the camera coordinate system is rotated by an angle θ around the y-axis of the real world coordinate system,

R_y = [  cosθ  0  sinθ ]
      [   0    1   0   ]
      [ -sinθ  0  cosθ ]

that is, R_y represents a rotation by θ about the y-axis only, with no rotation about the x-axis or z-axis.
When the origin of the camera coordinate system is rotated by an angle θ around the z-axis of the real world coordinate system,

R_z = [ cosθ  -sinθ  0 ]
      [ sinθ   cosθ  0 ]
      [  0      0    1 ]

that is, R_z represents a rotation by θ about the z-axis only, with no rotation about the x-axis or y-axis.
Any rotation can be decomposed into rotation components about the x-, y-, and z-axes; the matrices for rotation about the different axes are simply multiplied in sequence, i.e. R = R_x · R_y · R_z. R is the true rotation matrix of the camera relative to the real world coordinate system.
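The rotation decomposition and the projection of step 5 can be sketched together. This is a minimal implementation of the standard pinhole camera model that the patent's formula instantiates, not the patent's own code; all names are illustrative:

```javascript
// Per-axis rotation matrices, as in the decomposition R = Rx * Ry * Rz.
const rotX = (t) => [
  [1, 0, 0],
  [0, Math.cos(t), -Math.sin(t)],
  [0, Math.sin(t), Math.cos(t)],
];
const rotY = (t) => [
  [Math.cos(t), 0, Math.sin(t)],
  [0, 1, 0],
  [-Math.sin(t), 0, Math.cos(t)],
];
const rotZ = (t) => [
  [Math.cos(t), -Math.sin(t), 0],
  [Math.sin(t), Math.cos(t), 0],
  [0, 0, 1],
];

// 3x3 matrix product, used to compose R = Rx * Ry * Rz.
function matMul(A, B) {
  return A.map((row) =>
    B[0].map((_, j) => row[0] * B[0][j] + row[1] * B[1][j] + row[2] * B[2][j])
  );
}

// Sketch of step 5: project the real-world point (x, y, 0) onto the camera
// projection plane using internal parameters f, Sx, Sy, u0, v0 and the
// extrinsic rotation R (3x3 array) and translation T = [a, b, c].
function project(point, intr, R, T) {
  const [x, y, z] = point;
  // Transform into the camera coordinate system: pc = R * p + T.
  const pc = [0, 1, 2].map(
    (i) => R[i][0] * x + R[i][1] * y + R[i][2] * z + T[i]
  );
  // Perspective divide by the depth Zc, then apply f/Sx, f/Sy, u0, v0.
  return {
    Cx: (intr.f / intr.Sx) * (pc[0] / pc[2]) + intr.u0,
    Cy: (intr.f / intr.Sy) * (pc[1] / pc[2]) + intr.v0,
  };
}
```

With zero rotation and a pure z-translation, a point at the real-world origin projects to the image center (u_0, v_0), as expected from the formula.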
The spatial positioning method for a web page control in AR and the AR system described above ensure accurate spatial positioning of the web page control in AR by acquiring and transforming the control's coordinate information across several coordinate systems.
In the description of the embodiments of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the embodiments of the present invention and simplifying the description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus should not be construed as limiting the embodiments of the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
In the description of the embodiments of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a detachable connection, or an integral connection; a mechanical connection, an electrical connection, or mutual communication; a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements or an interaction between two elements. Specific meanings of the above terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to the specific situation.
In embodiments of the invention, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are not in direct contact but are in contact via another feature between them. Moreover, the first feature being "on," "above," or "over" the second feature includes the first feature being directly on or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. The first feature being "under," "below," or "beneath" the second feature includes the first feature being directly under or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the invention. In order to simplify the disclosure of embodiments of the invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, embodiments of the invention may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, embodiments of the present invention provide examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processing module-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (6)

1. A method for spatially positioning a web page control in an AR (augmented reality), comprising:
step 1, acquiring the visible window coordinates of a control in a PC browser according to the page coordinates of the control on a webpage, the visible window coordinates taking the upper left corner of the visible window as the origin of coordinates;
step 2, calculating the browser coordinates of the control in the browser according to the visible window coordinates, the browser coordinates taking the upper left corner of the browser as the origin of coordinates;
step 3, calculating to obtain the screen coordinate of the control on the PC display screen according to the browser coordinate and the origin of the browser coordinate;
step 4, calculating and obtaining real world coordinates of the control in the real world according to the screen coordinates;
step 5, calculating and obtaining coordinates of projection corresponding points of the control on a camera projection plane of the AR equipment by utilizing a preset three-dimensional registration algorithm according to real world coordinates so as to complete space positioning;
wherein, step 1 includes:
step 11, judging whether no scroll bar of the current visible window has been scrolled;
step 12, if the judgment result of step 11 is yes, the visible window coordinates of the control (X_view, Y_view) = (X, Y); otherwise the visible window coordinates of the control (X_view, Y_view) = (X - b, Y - a);
when X_view < 0, Y_view < 0, X_view > M or Y_view > N, the control is in an invisible state in the visible window;
wherein (X, Y) represents the page coordinates of the control, a represents the distance the right-hand (vertical) scroll bar of the visible window has scrolled from the top, b represents the distance the bottom (horizontal) scroll bar of the visible window has scrolled from the left, M represents the maximum X-axis coordinate of the visible window, and N represents the maximum Y-axis coordinate of the visible window;
and step 2 comprises:
step 21, judging whether the browser is in a full screen display state;
step 22, if the judgment result of step 21 is yes, the browser coordinates of the control (X_browser, Y_browser) = (X_view, Y_view + h_full); otherwise the browser coordinates of the control (X_browser, Y_browser) = (X_view, Y_view + h_notfull);
wherein h_full represents the height of the browser coordinate system above the visible window in the full-screen display state, and h_notfull represents that height in the non-full-screen display state.
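The coordinate arithmetic of steps 1 and 2 can be sketched as follows; this is a minimal Python illustration, with function and parameter names (`scroll_left`, `chrome_height`, etc.) chosen for clarity rather than taken from the patent:

```python
def page_to_visible(x, y, scroll_left, scroll_top, max_x, max_y):
    """Step 1: page coordinates -> visible-window coordinates.

    scroll_left / scroll_top play the roles of b and a in the claim:
    how far the horizontal and vertical scroll bars have scrolled.
    When nothing is scrolled they are 0 and (X_view, Y_view) = (X, Y).
    Also returns a visibility flag implementing the
    X_view < 0, Y_view < 0, X_view > M, Y_view > N test.
    """
    xv, yv = x - scroll_left, y - scroll_top
    visible = 0 <= xv <= max_x and 0 <= yv <= max_y
    return xv, yv, visible

def visible_to_browser(xv, yv, chrome_height):
    """Step 2: visible-window coordinates -> browser coordinates.

    chrome_height stands for h_full or h_notfull: the height occupied
    by the browser UI above the visible window in the current
    (full-screen or windowed) display state.
    """
    return xv, yv + chrome_height
```

For example, a control at page coordinates (100, 500) in a viewport scrolled down by 300 px sits at visible-window coordinates (100, 200), and with 85 px of browser chrome above the viewport, at browser coordinates (100, 285).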
2. The method for spatially positioning a web page control in an AR as claimed in claim 1, wherein step 3 comprises: when the browser is determined to be in the full-screen display state, the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser); when the browser is determined to be in a non-full-screen display state, the screen coordinates (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O); wherein (X_O, Y_O) represents the coordinates of the origin of the browser coordinate system in the screen coordinate system.
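The step-3 transform is a simple conditional offset; a minimal sketch (names illustrative, not from the patent):

```python
def browser_to_screen(xb, yb, origin_x, origin_y, fullscreen):
    """Step 3: browser coordinates -> PC screen coordinates.

    (origin_x, origin_y) is (X_O, Y_O): the browser window's origin in
    the screen coordinate system. In full-screen mode the two systems
    coincide, so the coordinates pass through unchanged.
    """
    if fullscreen:
        return xb, yb
    return xb + origin_x, yb + origin_y
```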
3. The method of claim 2, wherein (X_display, Y_display, 0) represents the coordinates of the virtual object in the real world and (C_x, C_y) represents the coordinates of the projected corresponding point of the control on the camera projection plane of the AR device, and step 5 comprises:
calculating (C_x, C_y) by the following formula to complete the spatial positioning:

    Z_c · [C_x, C_y, 1]^T = K · T · [X_display, Y_display, 0, 1]^T

wherein

    K = | f·S_x    0     u_0 |
        |   0    f·S_y   v_0 |
        |   0      0      1  |

represents the coefficient matrix of the camera's internal parameters, f represents the focal length of the camera, S_x and S_y represent scale factors, and (u_0, v_0) represents the coordinates of the center of the acquired image; and

    T = | R   t |
        | 0   1 |

represents the transformation matrix transforming the real-world coordinate system to the camera coordinate system.
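Under the standard pinhole-camera reading of this three-dimensional registration step, the projection can be sketched as follows; `project` and its parameter names are illustrative assumptions, not identifiers from the patent:

```python
import numpy as np

def project(world_xyz, K, R, t):
    """Project a real-world point onto the camera projection plane.

    K is the intrinsic matrix [[f*Sx, 0, u0], [0, f*Sy, v0], [0, 0, 1]];
    R (3x3) and t (3,) make up the world-to-camera transformation.
    Returns (Cx, Cy) after the homogeneous divide by the depth term.
    """
    p_cam = R @ np.asarray(world_xyz, dtype=float) + t  # world -> camera
    u, v, w = K @ p_cam                                 # camera -> image plane
    return u / w, v / w
```

With R = I, t = (0, 0, 5), and an image center of (320, 240), the real-world origin projects exactly to the image center (320, 240), which is a quick way to validate a calibration.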
4. An AR system comprises a PC end and an AR end which are connected with each other, and is characterized in that the PC end comprises a first calculation module, a second calculation module, a third calculation module and a fourth calculation module, and the AR end comprises an AR calculation module;
the first calculation module is used for acquiring the visible window coordinates of the control in the PC browser according to the page coordinates of the control on the webpage, the visible window coordinates taking the upper left corner of the visible window as the origin of coordinates;
the second calculation module is used for calculating the browser coordinates of the control in the browser according to the visible window coordinates, the browser coordinates taking the upper left corner of the browser as the origin of coordinates;
the third calculation module is used for calculating and obtaining the screen coordinates of the control on the PC display screen according to the browser coordinates and the origin of the browser coordinates;
the fourth calculation module is used for calculating and obtaining real world coordinates of the control in the real world according to the screen coordinates;
the AR calculation module is used for calculating and obtaining coordinates of projection corresponding points of the control on a camera projection plane of the AR equipment by utilizing a preset three-dimensional registration algorithm according to real world coordinates so as to complete space positioning;
wherein, the first calculation module comprises:
the first judgment unit is used for judging whether no scroll bar of the current visible window has been scrolled;
a first calculating unit, configured to determine, if the judgment result of the first judgment unit is yes, the visible window coordinates of the control (X_view, Y_view) = (X, Y); otherwise the visible window coordinates of the control (X_view, Y_view) = (X - b, Y - a);
when X_view < 0, Y_view < 0, X_view > M or Y_view > N, the control is in an invisible state in the visible window;
wherein (X, Y) represents the page coordinates of the control, a represents the distance the right-hand (vertical) scroll bar of the visible window has scrolled from the top, b represents the distance the bottom (horizontal) scroll bar of the visible window has scrolled from the left, M represents the maximum X-axis coordinate of the visible window, and N represents the maximum Y-axis coordinate of the visible window;
and the second computing module comprises:
the second judgment unit is used for judging whether the browser is in a full-screen display state or not;
a second calculating unit, configured to determine, if the judgment result of the second judgment unit is yes, the browser coordinates of the control (X_browser, Y_browser) = (X_view, Y_view + h_full); otherwise the browser coordinates of the control (X_browser, Y_browser) = (X_view, Y_view + h_notfull);
wherein h_full represents the height of the browser coordinate system above the visible window in the full-screen display state, and h_notfull represents that height in the non-full-screen display state.
5. The AR system of claim 4, wherein the third computing module is further configured to: when the browser is determined to be in the full-screen display state, determine the screen coordinates (X_screen, Y_screen) = (X_browser, Y_browser); when the browser is determined to be in a non-full-screen display state, determine the screen coordinates (X_screen, Y_screen) = (X_browser + X_O, Y_browser + Y_O); wherein (X_O, Y_O) represents the coordinates of the origin of the browser coordinate system in the screen coordinate system.
6. The AR system of claim 5, wherein (X_display, Y_display, 0) represents the coordinates of the virtual object in the real world and (C_x, C_y) represents the coordinates of the projected corresponding point of the control on the camera projection plane of the AR device, and the AR calculation module calculates (C_x, C_y) according to the following formula to complete the spatial positioning:

    Z_c · [C_x, C_y, 1]^T = K · T · [X_display, Y_display, 0, 1]^T

wherein

    K = | f·S_x    0     u_0 |
        |   0    f·S_y   v_0 |
        |   0      0      1  |

represents the coefficient matrix of the camera's internal parameters, f represents the focal length of the camera, S_x and S_y represent scale factors, and (u_0, v_0) represents the coordinates of the center of the acquired image; and

    T = | R   t |
        | 0   1 |

represents the transformation matrix transforming the real-world coordinate system to the camera coordinate system.
CN201811000513.7A 2018-08-30 2018-08-30 Space positioning method of webpage control in AR and AR system Active CN109284456B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811000513.7A CN109284456B (en) 2018-08-30 2018-08-30 Space positioning method of webpage control in AR and AR system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811000513.7A CN109284456B (en) 2018-08-30 2018-08-30 Space positioning method of webpage control in AR and AR system

Publications (2)

Publication Number Publication Date
CN109284456A CN109284456A (en) 2019-01-29
CN109284456B true CN109284456B (en) 2022-04-12

Family

ID=65183649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811000513.7A Active CN109284456B (en) 2018-08-30 2018-08-30 Space positioning method of webpage control in AR and AR system

Country Status (1)

Country Link
CN (1) CN109284456B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111311687B (en) * 2020-01-21 2022-12-02 上海万物新生环保科技集团有限公司 Method and equipment for detecting spatial position of mobile phone screen pixel point

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102253819A (en) * 2011-08-08 2011-11-23 深圳超多维光电子有限公司 Multimedia display method and system based on browser
CN104833360A (en) * 2014-02-08 2015-08-12 无锡维森智能传感技术有限公司 Method for transforming two-dimensional coordinates into three-dimensional coordinates
US9196067B1 (en) * 2013-03-05 2015-11-24 Amazon Technologies, Inc. Application specific tracking of projection surfaces
CN106373085A (en) * 2016-09-20 2017-02-01 福州大学 Intelligent terminal 3D watch try-on method and system based on augmented reality
CN107392961A (en) * 2017-06-16 2017-11-24 华勤通讯技术有限公司 Space-location method and device based on augmented reality
CN108304075A (en) * 2018-02-11 2018-07-20 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1558343A (en) * 2004-01-30 2004-12-29 中国科学院计算技术研究所 Three dimensional resource browser and manager and method thereof
CN101923462A (en) * 2009-06-10 2010-12-22 成都如临其境创意科技有限公司 FlashVR-based three-dimensional mini-scene network publishing engine
CN102289486A (en) * 2011-08-08 2011-12-21 深圳超多维光电子有限公司 Picture display method and system based on browser
KR101380854B1 (en) * 2013-03-21 2014-04-04 한국과학기술연구원 Apparatus and method providing augmented reality contents based on web information structure
WO2016203792A1 (en) * 2015-06-15 2016-12-22 ソニー株式会社 Information processing device, information processing method, and program
CN105893602B (en) * 2016-04-21 2019-11-05 北京京东尚科信息技术有限公司 Full screen display process and system for chart in the webpage of terminal browser
CN107070784A (en) * 2017-06-14 2017-08-18 李云 A kind of 3D instant communicating systems based on WebGL and VR technologies

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102253819A (en) * 2011-08-08 2011-11-23 深圳超多维光电子有限公司 Multimedia display method and system based on browser
US9196067B1 (en) * 2013-03-05 2015-11-24 Amazon Technologies, Inc. Application specific tracking of projection surfaces
CN104833360A (en) * 2014-02-08 2015-08-12 无锡维森智能传感技术有限公司 Method for transforming two-dimensional coordinates into three-dimensional coordinates
CN106373085A (en) * 2016-09-20 2017-02-01 福州大学 Intelligent terminal 3D watch try-on method and system based on augmented reality
CN107392961A (en) * 2017-06-16 2017-11-24 华勤通讯技术有限公司 Space-location method and device based on augmented reality
CN108304075A (en) * 2018-02-11 2018-07-20 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Real-time Projection Method for Augmented Reality Assisted Assembly; Guo Hong-jie; Proceedings of 2018 International Forum on Construction, Aviation and Environmental Engineering - Internet of Things; 2018-07-31; 300-305 *
A Survey of Human-Computer Interaction for Virtual Reality (虚拟现实的人机交互综述); Zhang Fengjun; Scientia Sinica Informationis (中国科学:信息科学); 2016-12-20; Vol. 46, No. 12; 1711-1736 *

Also Published As

Publication number Publication date
CN109284456A (en) 2019-01-29

Similar Documents

Publication Publication Date Title
JP6186415B2 (en) Stereoscopic image display method and portable terminal
US20100208057A1 (en) Methods and systems for determining the pose of a camera with respect to at least one object of a real environment
CN116362971A (en) System and method for generating a post-stitched panoramic view
CN105825544A (en) Image processing method and mobile terminal
JP6096634B2 (en) 3D map display system using virtual reality
US11477432B2 (en) Information processing apparatus, information processing method and storage medium
US11651556B2 (en) Virtual exhibition space providing method for efficient data management
RU2612572C2 (en) Image processing system and method
CN105391938A (en) Image processing apparatus, image processing method, and computer program product
US11250643B2 (en) Method of providing virtual exhibition space using 2.5-dimensionalization
CN110084797B (en) Plane detection method, plane detection device, electronic equipment and storage medium
CN107222737B (en) A kind of processing method and mobile terminal of depth image data
EP2991323A1 (en) Mobile device and method of projecting image by using the mobile device
CN110532497B (en) Method for generating panorama, method for generating three-dimensional page and computing device
US9802539B2 (en) Distance and direction estimation of a target point from a vehicle using monocular video camera
US20210208398A1 (en) Head mounted display apparatus, virtual reality display system and driving method thereof
CN109284456B (en) Space positioning method of webpage control in AR and AR system
JP5461782B2 (en) Camera image simulator program
CN108269288B (en) Intelligent special-shaped projection non-contact interaction system and method
CN113689508A (en) Point cloud marking method and device, storage medium and electronic equipment
JP6314672B2 (en) Display processing apparatus, display processing method, and program
CN110442314B (en) Display method, terminal and computer readable storage medium
KR101082545B1 (en) Mobile communication terminal had a function of transformation for a picture
CN108702465B (en) Method and apparatus for processing images in virtual reality system
CN116128744A (en) Method for eliminating image distortion, electronic device, storage medium and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 510000 room 713, 7th floor, No. 8, Huangzhou Industrial Zone, chepo Road, Dongpu, Guangzhou, Guangdong

Applicant after: Guangdong Yuewei Information Technology Co.,Ltd.

Address before: 510000 room 713, floor 7, No. 8, Huangzhou Industrial Zone, chepo Road, Dongpu, Tianhe District, Guangzhou, Guangdong

Applicant before: GUANGZHOU YUWAY INFORMATION TECHNOLOGY CO.,LTD.

GR01 Patent grant