CN110956702A - 3D visual editor and editing method based on time axis - Google Patents


Info

Publication number
CN110956702A
CN110956702A (application number CN201911130859.3A)
Authority
CN
China
Prior art keywords
time
information
time axis
glasses
time node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911130859.3A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Cuitai Intelligent Technology Co ltd
Original Assignee
Shanghai Cuitai Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Cuitai Intelligent Technology Co ltd filed Critical Shanghai Cuitai Intelligent Technology Co ltd
Priority to CN201911130859.3A priority Critical patent/CN110956702A/en
Publication of CN110956702A publication Critical patent/CN110956702A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Abstract

The invention discloses a 3D visual editor based on a time axis, comprising at least a pair of XR glasses. The XR glasses display at least one time axis; at least one time node is marked on the time axis; at least one virtual electronic tag is created on the time node; and at least one piece of editable information related to the time node is added to the virtual electronic tag. The 3D era has brought a series of changes: 3D display, 3D interaction, and 3D editing. The time-axis-based 3D visual editor effectively improves the efficiency and effect of editing in three-dimensional space, turning an editing state that previously only a few people could picture in their minds into a convenience that everyone can enjoy.

Description

3D visual editor and editing method based on time axis
Technical Field
The invention relates to the field of editing tools in computer technology, and in particular to a 3D visual editor and editing method based on a time axis.
Background
Owing to the limitations of computer display hardware, existing editing tools/software usually present content in 2D (two dimensions); even 3D editing software renders its various perspective views on a 2D display, such as the classic one-point, two-point, and three-point perspectives.
The Hollywood blockbuster Avatar won acclaim from film fans worldwide with its immersive 3D (three-dimensional) effects, setting off a wave of 3D films and turning many former skeptics into supporters of 3D technology, filling people with anticipation for 3D film and television in the home. 3D technology, long wandering outside the mainstream market, has finally sounded the charge into the home entertainment industry.
3D is an abbreviation of three-dimensional, i.e. a stereoscopic image. The underlying principle is not complex: a person perceives the position of an object in three-dimensional space from the slight difference between what the left and right eyes see. On a screen, as long as the left eye and the right eye are shown slightly different images, the picture as a whole appears stereoscopic.
3D technology has developed rapidly for commercial use. More and more 3D stereoscopic cinemas are being built, and the number of 3D films is growing quickly. As the technology develops, 3D is showing unprecedented strength in the field of consumer electronics through its penetration of film, music, animation, and online games, and the trend toward 3D in consumer electronics is unmistakable.
VR, short for Virtual Reality, features immersion, interaction, and imagination. VR integrates computer graphics, simulation, multimedia, artificial intelligence, computer networking, parallel processing, multi-sensor, and other technologies to simulate human vision, hearing, touch, and other senses, so that the user feels immersed in a computer-generated virtual world as if actually there, can communicate in real time through speech, gestures, and the like, and experiences a heightened sense of presence and immersion. Through VR, people can experience lifelike reproductions of the real world, break through limitations of time and space, and enjoy the wonder of entering a virtual world.
AR stands for Augmented Reality. The technology uses computers to overlay virtual information onto the real world; the result is displayed on mobile phones, tablets, and other devices and perceived by the user, so that the real and virtual worlds merge and the real world is enriched. In short, AR brings flat content "to life", attaches more information to real objects, strengthens the sense of depth, and enhances the visual effect and the interactive experience.
MR, short for Mixed Reality, is a further development of virtual reality technology. It introduces real-scene information into the virtual environment and builds a bridge of interactive feedback among the virtual world, the real world, and the user, enhancing the realism of the experience. The key to MR is interaction with the real world and timely acquisition of information, so it must run in an environment where it can interact with real things. If the whole environment is virtual, it is VR; if the presented virtual information is merely a simple overlay on reality, it is AR. The distinction between MR and AR, simply put: AR only overlays a virtual layer without concerning itself with reality, whereas MR can let the user see, through cameras, aspects of reality invisible to the naked eye.
VR, AR, and MR are collectively referred to as XR. In addition, 3D digital cameras and 3D digital photo frames have been commercialized. A 3D digital camera can superimpose two views of the same scene captured through dual lenses to generate a stereoscopic image, or use its 3D function to compare different color modes. A 3D digital photo frame uses the principle of polarization to refract different images to the two eyes simultaneously, and can even make ordinary photos appear three-dimensional when viewed with 3D glasses. In recent years Panasonic developed the world's first full-high-definition 3D camcorder, whose dual-lens design is quite distinctive: Panasonic integrated the lens, the camera front end, and a dual memory-card recorder, making the body lighter and more flexible for handheld shooting. The machine can also automatically correct the image and record 3D video directly without any additional equipment. Microsoft, for its part, developed a binocular AR head-mounted display (which Microsoft itself calls MR) some time ago.
The invention with application number CN201710348474.9 provides an apparatus for editing three-dimensional shape data, a recording medium storing a three-dimensional shape data editing program, and a recording medium storing a data structure of three-dimensional shape data. The apparatus includes a range setting unit that sets at least one of a protected range and an editable range of a three-dimensional shape, represented by three-dimensional shape data including three-dimensional position information, as an editing control range, and a control condition setting unit that sets a control condition so that, when the editable range is edited, the protected range is not edited.
The invention with application number CN201910461221.1 discloses a page processing method, apparatus, and medium. The method comprises: receiving an editing instruction for a target page; determining a target area of the target page; acquiring the image to be edited corresponding to the target area; determining, from the three-dimensional scene corresponding to the target page, the three-dimensional model to be edited that corresponds to the image; acquiring the adsorption-area information of that model in the scene; and editing the model in the scene according to the adsorption-area information and the editing information carried by the instruction to obtain the target image. The model can thus be conveniently moved within the three-dimensional scene as editing requires, letting a user edit three-dimensional models at a finer granularity from a page and improving the interactive experience.
Disclosure of Invention
The editing referred to in the above patents is still 3D editing performed in a 2D display state. The 3D era has brought a series of changes: 3D display, 3D interaction, and 3D editing. A time-axis-based 3D visual editor can effectively improve the efficiency and effect of editing in three-dimensional space, turning the immersive editing state that only a few people could previously picture in their minds into convenience and intuitiveness that many more people can enjoy through immersive 3D display and editing.
The invention with application number CN201811594553.9 provides a head-mounted computer that is highly portable, convenient to use, and private. It also provides an information input method based on the head-mounted computer that breaks away from conventional input devices and allows information to be input and edited at any time. It helps supply an input solution for immersive 3D display.
The present invention provides a time-axis-based 3D visual editor and editing method, aiming to offer a new immersive editing tool and mode under a 3D display effect, so that users can conveniently and quickly use a head-mounted display (AR/VR/MR), on the basis of XR technology, to perform immersive time-axis-based editing; the tool is suitable for a wide range of users.
The technical scheme provided by the invention is as follows.
The invention provides a 3D visual editor based on a time axis, comprising at least a pair of VR/AR/MR glasses (collectively, XR glasses). The XR glasses display at least one time axis; at least one time node is marked on the time axis; at least one virtual electronic tag is created on the time node; and at least one piece of editable information related to the time node is added to the virtual electronic tag.
Further, the editable information related to the time node comprises at least text information, picture information, video information, audio information, person information, location information, time information, files, or hyperlinks. The text information is used to add, delete, and modify text content related to the time node; the picture information is used to add, delete, and modify picture content related to the time node; the video information is used to add, delete, and modify video content related to the time node; the audio information is used to add, delete, and modify audio content related to the time node. The person information comprises name, gender, age, native place, clothing, and/or preferences; the location information comprises country, region, community, street, building, floor, room number, and/or latitude and longitude; the time information comprises era, year, month, date, and time of day; the files include text, spreadsheet, picture, video, audio, map, mail, CAD, or application formats.
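The patent describes this data model only in prose. As a minimal sketch of the time axis / time node / virtual electronic tag hierarchy of the preceding paragraphs (all class, field, and method names here are illustrative assumptions, not the patent's API):

```python
# Illustrative sketch only; names (Timeline, TimeNode, VirtualTag) are assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class VirtualTag:
    """An editable virtual electronic tag attached to a time node."""
    text: Optional[str] = None                    # text content (add/delete/modify)
    pictures: list = field(default_factory=list)  # picture content
    videos: list = field(default_factory=list)    # video content
    audio: list = field(default_factory=list)     # audio content
    person: Optional[dict] = None                 # name, gender, age, ...
    location: Optional[dict] = None               # country, ..., lat/lon
    files: list = field(default_factory=list)     # attached files / hyperlinks

@dataclass
class TimeNode:
    """A marked point on the time axis; may hold several tags."""
    moment: datetime
    tags: list = field(default_factory=list)      # list[VirtualTag]

@dataclass
class Timeline:
    """An ordered sequence of time nodes displayed by the XR glasses."""
    name: str
    nodes: list = field(default_factory=list)     # list[TimeNode]

    def mark(self, moment: datetime) -> TimeNode:
        """Mark a time node; nodes are kept in chronological order."""
        node = TimeNode(moment)
        self.nodes.append(node)
        self.nodes.sort(key=lambda n: n.moment)
        return node
```

A tag is created on a node with `tl.mark(moment).tags.append(VirtualTag(text="..."))`; the dictionary-valued `person` and `location` fields stand in for the sub-fields listed above.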
Further, one time axis can, according to an instruction, be separated at a time node and displayed as at least two time axes.
Further, the XR glasses are configured to display at least two time axes, which are displayed, according to the instructions, as coplanar and parallel, parallel along a curved surface, temporally approaching and receding without intersecting, and/or intersecting at least once.
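The separation of one time axis into two at a node can be sketched as a plain data operation. This is an illustrative assumption of one plausible behavior (branches share their history up to the splitting node), not the patent's implementation; the names are hypothetical:

```python
# Hypothetical sketch of separating one time axis into two branch axes.
from datetime import datetime

class Timeline:
    """A named sequence of (datetime, label) node pairs; names are assumptions."""
    def __init__(self, name, nodes=None):
        self.name = name
        self.nodes = nodes or []  # list of (datetime, label) pairs

    def split_at(self, moment):
        """Separate this axis at `moment` into two branch axes that share
        all nodes up to and including the splitting time node."""
        shared = [n for n in self.nodes if n[0] <= moment]
        later = [n for n in self.nodes if n[0] > moment]
        # Branch A carries on with the later nodes; branch B starts fresh.
        return (Timeline(self.name + "/branch-A", shared + later),
                Timeline(self.name + "/branch-B", list(shared)))
```

In the family scenario of fig. 3 below, branch A would be one family member's continuing axis and branch B the other's, both sharing the family history before the split.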
Further, for a user wearing the XR glasses and moving in space, the time axis is displayed laterally in the field of view of the XR glasses; the time nodes are marked on the time axis, and the virtual electronic tags are thereby created on the time nodes.
Further, the laterally displayed time axis is raised/lowered in whole or in part according to the instruction.
Further, around the time axis, widget software is displayed as an overlay, including but not limited to a calculator, clock, calendar, weather, color markers, or place markers.
Further, the time node is disposed at the lower, upper, left, right, rear, or front edge of the time axis, and correspondingly the virtual electronic tag is disposed at the opposite edge (upper, lower, right, left, front, or rear) of the time axis.
Further, according to the instruction, the virtual electronic tags are displayed approximately perpendicular to the time axis and stacked front to back, or approximately parallel to the time axis and stacked top to bottom; the virtual electronic tag is arranged on the other side of the corresponding time node.
Further, according to the instruction, the stacked virtual electronic tags are instead displayed unstacked and tiled on a plane, a curved surface, or in flat space within the field of view of the XR glasses, or tiled around a horizontal perimeter centered on the XR glasses.
Further, the plurality of time nodes and their corresponding virtual electronic tags are approximately perpendicular to the time axis and, according to the instruction, rotate directionally so as to face the XR glasses.
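The patent does not specify the math behind this directional rotation. Under the assumption that each tag rotates about the vertical axis only (billboard-style), it reduces to computing a yaw from the tag toward the glasses position; the function name and coordinate convention are assumptions:

```python
# Hypothetical yaw computation for a tag turning to face the XR glasses.
import math

def facing_yaw(tag_pos, glasses_pos):
    """Yaw (radians, about the vertical y axis) that turns a tag toward
    the glasses; positions are (x, y, z) tuples with y up, +z forward."""
    dx = glasses_pos[0] - tag_pos[0]
    dz = glasses_pos[2] - tag_pos[2]
    return math.atan2(dx, dz)
```

Re-evaluating this yaw every frame as the spatial positioning sensors report the wearer's movement yields the continuous turning behavior described above.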
Further, the XR glasses comprise at least a spatial positioning sensor, an information input device, and a display output device. The spatial positioning sensor locates the movement of the XR glasses in space; the information input device records input information and issues input instructions; and the display output device displays output synchronously according to the input instructions.
Further, the spatial positioning sensor comprises at least a TOF camera, a fisheye camera, an IMU sensor, a gyroscope, a lighthouse positioning system, a WiFi positioning system, a LiFi positioning system, a UWB positioning system, an acoustic positioning system, a laser radar, and/or a millimeter-wave radar; the information input device comprises at least a microphone, a control handle, a gesture recognizer, a motion capturer, digital gloves, a head-aiming device, an eye tracker, a mouse, a trackball, or a keyboard; the display output device is a 3D display device.
The invention also provides a method of using the time-axis-based 3D visual editor: at least one time axis is displayed in the field of view of the XR glasses; at least one time node is marked on the time axis; at least one virtual electronic tag is created on the time node; and at least one piece of editable information related to the time node is added to the virtual electronic tag.
Further, at a time node of one time axis, the axis is, according to instructions, separated and displayed as at least two time axes.
Further, at least two time axes are displayed in the field of view of the XR glasses, displayed as coplanar and parallel, parallel along a curved surface, temporally approaching and receding without intersecting, and/or intersecting at least once.
Further, the time axis is displayed laterally in the field of view of the XR glasses; the time nodes are marked on the time axis, and the virtual electronic tags are thereby created on the time nodes.
Further, the laterally displayed time axis is raised/lowered in whole or in part; alternatively, a vertically displayed time axis is shifted left/right in whole or in part.
Further, the time node is arranged at the lower edge, the upper edge, the left edge, the right edge, the rear edge or the front edge of the time axis, and correspondingly, the virtual electronic tag is arranged at the upper edge, the lower edge, the right edge, the left edge, the front edge or the rear edge of the time axis.
Further, the virtual electronic tags are displayed approximately perpendicular to the time axis and stacked front to back, or approximately parallel to the time axis and stacked top to bottom; the virtual electronic tag is arranged on the other side of the corresponding time node.
Further, the stacked virtual electronic tags are displayed unstacked and tiled on a plane, a curved surface, or in flat space within the field of view of the XR glasses, or tiled around a horizontal perimeter centered on the XR glasses.
Further, a plurality of time nodes and their corresponding virtual electronic tags are approximately perpendicular to the time axis and rotate directionally so as to face the XR glasses.
Further, the XR glasses at least comprise a spatial positioning sensor, an information input device and a display output device, wherein the spatial positioning sensor positions the movement of the XR glasses in space; the information input device records input information and sends an input instruction; and the display output device synchronously displays and outputs according to the input instruction.
Compared with the prior art, the invention builds on immersive 3D display technology and performs editing on a displayed 3D time axis. The 3D era has brought a series of changes: 3D display, 3D interaction, and 3D editing. The time-axis-based 3D visual editor effectively improves the efficiency and effect of editing in three-dimensional space, turning the immersive editing state that only a few people could previously picture in their minds into convenience and intuitiveness that many more people can enjoy through immersive 3D display and editing.
Drawings
The foregoing features, technical details, advantages, and embodiments are described below in a clear and readily understandable manner with reference to the accompanying drawings.
FIG. 1 is a schematic perspective view of an editing interface based on time axis as viewed through XR glasses.
FIG. 2 is a schematic diagram of a top view of an editing interface based on a timeline according to the present invention.
FIG. 3 is a schematic diagram of a 3D structure of a time axis branching situation according to the present invention.
FIG. 4 is a schematic 3D structure of two time axes according to the present invention.
FIG. 5 is a schematic diagram of a 3D structure of the present invention for two crossing time axes.
Fig. 6 is a flow chart of a daily-life timeline according to the present invention.
Fig. 7 is a schematic 3D structure of VR glasses used in the present invention.
Fig. 8 is a schematic 3D structure of AR glasses used in the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
The embodiments described below by referring to the drawings, in which the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout, are exemplary only for explaining the present invention, and are not construed as limiting the present invention.
In describing the present invention, it is to be understood that terms such as center, longitudinal, lateral, length, width, thickness, up, down, front, back, left, right, vertical, horizontal, top, bottom, inside, outside, clockwise, and counterclockwise indicate orientations or positional relationships based on those shown in the drawings, serve only to simplify the description, and should not be construed as limiting the invention. Furthermore, terms such as first and second are used for descriptive purposes only and are not to be construed as indicating relative importance or implying the number of technical features concerned. Unless otherwise expressly specified, terms such as mounting and connecting should be understood broadly; those skilled in the art will understand their specific meaning in this application according to the particular situation.
Example one
In an apparatus embodiment of the present invention, as shown in fig. 1, fig. 7, and fig. 8, a time-axis-based 3D visual editor (XR glasses) 100 comprises at least an XR display device 1/10. Through the immersive XR display device 1, a time axis 80 can be displayed in (virtual) space, as shown in fig. 1. Four time nodes 70 are marked on the time axis 80, at 1 July 1987, 12 June 1999, 30 March 2010, and 23 June 2018 (the specific dates carry no particular meaning and are for illustration only). Corresponding to them are four virtual electronic tags 90: four events, labeled A through D, are marked on the four time nodes 70 and recorded in the four virtual electronic tags 90 respectively.
It is worth noting that a virtual electronic tag 90 may also be created after the time node 70 is created; an initially blank virtual electronic tag 90 is illustrated in the lower left corner of fig. 1. If a virtual electronic tag 90 is created before any time node 70, it can only hover in space, corresponding to no time node 70 on the time axis 80. Several virtual electronic tags 90 may be created on a single time node 70.
Information such as the specific content of an event can then be added to (and later edited in) the virtual electronic tag 90. In this way a user wearing the XR device 1/10 can visually and intuitively establish organic links across time, which is very convenient, for example, in police casework or business analysis, for visually examining the temporal relevance of an otherwise isolated event.
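The fig. 1 scenario can be written down as plain data, with the dates taken from the description above; the tuple representation is an illustrative assumption:

```python
# Fig. 1 reconstructed as data: each entry pairs a time node with the
# event label recorded in its virtual electronic tag.
from datetime import date

timeline = []  # list of (date, event-label) pairs
for d, label in [(date(1987, 7, 1), "Event A"),
                 (date(1999, 6, 12), "Event B"),
                 (date(2010, 3, 30), "Event C"),
                 (date(2018, 6, 23), "Event D")]:
    timeline.append((d, label))

# A blank tag created before any time node exists merely hovers in space,
# i.e. it has no date to attach to:
floating_tag = (None, "")
```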
The time axis in this space can be displayed either horizontally (spread out), as shown in fig. 1, or vertically. The time axis 80 may be raised/lowered as a whole or as a partial section.
Around the time axis 80, at (arbitrary) spatial locations, widget software is displayed as an overlay, including but not limited to a clock, calendar, address book, itinerary, map, calculator, clipboard, weather, CPU/MEM dashboard, system monitor, browser, office tools, game hall, virtual keyboard, color-marking virtual pen, or place-marking virtual pen.
As shown in fig. 7, the VR glasses body 10 includes an external environment camera 30 and a gesture input device 20 (e.g., a gesture camera). In the VR glasses, the processing center and the memory module may be integrated into the glasses body 10.
As shown in fig. 8, the AR glasses body 1 carries an AR display lens 2 and further includes: a voice input device (such as a microphone 5), cameras 6 (such as RGB, depth, or infrared cameras), a gesture input device (such as gesture cameras 4) that can be mounted symmetrically in pairs on both sides of the AR glasses, and earphones 3. In a conventional arrangement, the microphone 5 is mounted near the face via a connecting piece, the camera 6 is mounted at the top center of the lens 2, and the gesture cameras 4 are mounted on the sides of the AR glasses body. In the AR glasses, the processing center and the memory module may be integrated into the glasses body 1.
Example two
According to the above embodiment of the present invention, as shown in fig. 2, a plurality of virtual electronic tags 90 can be freely added to any time node 70 of a time axis 80 displayed in virtual space through the immersive XR display device 1/2, and specific content can be added to the virtual electronic tags 90, including text information, picture information, video information, audio information, person information, location information, time information, files, or hyperlinks. The files include text, spreadsheet, picture, video, audio, map, mail, CAD, or application formats.
A plurality of time nodes 70 and their corresponding virtual electronic tags 90 are approximately perpendicular to the time axis 80 and, according to display/editing instructions, are displayed either stacked or unstacked and tiled on a plane, a curved surface, or in flat space within the field of view of the XR glasses, or around a horizontal perimeter centered on the XR glasses. Stacked tags occlude one another front to back; unstacked tags do not.
According to the display/edit instruction, the corresponding time node 70 and virtual electronic tag 90 face the XR glasses 1/10 and rotate directionally (much as a sunflower turns toward the sun). That is, as shown in fig. 7 and fig. 8, the XR glasses 1/10 include indoor/outdoor positioning sensors such as a TOF camera, a fisheye camera, an IMU sensor, a gyroscope, a lighthouse positioning system, a WiFi positioning system, a LiFi positioning system, a UWB positioning system, an acoustic positioning system, a laser radar, and/or a millimeter-wave radar; as the user walks through space wearing the XR glasses 1/10, the time nodes 70 and virtual electronic tags 90 continuously turn to face the position of the XR glasses 1/10.
Also provided in the XR glasses 1/10 are information input devices (including a microphone, control handle, gesture recognizer, motion capturer, digital gloves, head-aiming device, eye tracker, mouse, trackball, or keyboard) and display output devices (a 3D display device, VR display, or AR display).
Information input on XR devices is likewise diverse, including head aiming (helmet sight), eye tracking, voice, smart gloves, control sticks, and bare-hand gestures, all of which greatly broaden what XR devices can realize and where they can be applied.
Example three
According to the two apparatus/system embodiments above, as shown in fig. 3, a time axis displayed in virtual space through the XR display device 1/2 can, according to a display/edit instruction, be separated at a time node 70 (5 June 2017) and displayed as at least two time axes.
As shown in fig. 4 and fig. 5, two time axes 80 may be displayed in virtual space through the immersive XR display device 1/2. According to the display/edit instruction, the two time axes 80 are displayed as coplanar and parallel, parallel along a curved surface, approaching and receding in time without intersecting, and/or intersecting at least once. In fig. 5 the intersecting time node is 15 May 2009.
Special situations also arise. For example, as shown in fig. 3, a family is taken as time axis 80; a child is born at a certain time node 70 (5 June 2017), after which the father, working away from home, continues as time axis 81, while the mother and child, staying at home, continue as time axis 82.
As shown in fig. 5, at a certain time node 70 (15 May 2009), person A and person B meet, their life trajectories represented by intersecting time axes (such as time axis 85).
Example four
According to the editor functions of the above embodiments, as shown in fig. 6, a personalized timeline reminder can be generated in a customized manner. A user may display a private time axis in virtual space via the XR display device 1/10, carrying the day's time nodes and virtual electronic tags; each time the clock in the XR display device 1/10 reaches a node, the user is given the corresponding content reminder.
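The reminder step can be sketched as a periodic check of the private time axis against the device clock; the function name and the one-minute firing window are assumptions for illustration, not the patent's design:

```python
# Hypothetical reminder check: surface tag content for nodes whose time
# has just arrived on the user's private time axis.
from datetime import datetime

def due_reminders(nodes, now):
    """Return tag texts for nodes whose time has arrived but not passed
    by more than a minute; `nodes` is a list of (datetime, text) pairs."""
    out = []
    for t, text in nodes:
        if 0 <= (now - t).total_seconds() < 60:
            out.append(text)
    return out
```

Calling this once per clock tick reproduces the behavior described above: each node's reminder fires once, when the device clock reaches it.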
Besides reminders, the editor can also record against time and set memory tags: a child is born at an international maternity and infant hospital at a certain time node; the whole family travels to London in a given year; virtual electronic tags are created at these time nodes, and written or spoken impressions, photos, or videos are added. This amounts to sketching out the life trajectory of an individual, or the origin and development of a family or an enterprise.
It should be noted that the step identifiers in the flow charts of this specification do not impose a fixed sequence; the order may be adjusted where reasonable.
It should be noted that the above embodiments can be freely combined as necessary. The foregoing is only a preferred embodiment of the present invention; for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these should also be regarded as within the protection scope of the present invention.

Claims (10)

1. A 3D visualization editor based on a time axis, characterized in that it comprises at least XR glasses,
the XR glasses are used for displaying at least one time axis;
marking at least one time node on the time axis;
at least one virtual electronic tag is created on the time node;
and adding at least one piece of editable information related to the time node into the virtual electronic tag.
2. The 3D visual editor of claim 1, wherein the editable information related to the time node comprises at least text information, picture information, video information, audio information, person information, location information, time information, files, or hyperlinks,
the text information is used for adding, deleting and modifying text contents related to the time node;
the picture information is used for adding, deleting and modifying picture contents related to the time node;
the video information is used for adding, deleting and modifying video contents related to the time node;
the audio information is used for adding, deleting and modifying audio contents related to the time node;
the personal information comprises personal name, gender, age, native place, clothes and/or preference;
the location information comprises country, region, community, street, building, floor, room number and/or longitude and latitude;
the time information comprises an era, a year, a month, a date, and a time of day;
the file is in a text format, a spreadsheet format, a picture format, a video format, an audio format, a map format, a mail format, a CAD format, or an application format.
3. The 3D visualization editor of claim 1, wherein, upon instruction, one of the time axes is split into at least two time axes at the time node.
4. The 3D visualization editor of claim 1, wherein the XR glasses are configured to display at least two of the time axes, which are, upon instruction, displayed as coplanar or parallel, close together or far apart without intersecting, and/or intersecting at least once.
5. A use method of a time-axis-based 3D visualization editor, characterized in that at least one time axis is displayed in the field of view of XR glasses;
marking at least one time node on the time axis;
creating at least one virtual electronic tag on the time node;
and adding at least one piece of editable information related to the time node into the virtual electronic tag.
6. The use method according to claim 5, wherein, according to an instruction, one of the time axes is split at the time node and displayed as at least two time axes.
7. The use method according to claim 5, wherein at least two of the time axes are displayed in the field of view of the XR glasses, the time axes being displayed as coplanar or parallel, close together or far apart without intersecting, and/or intersecting at least once.
8. The use method according to claim 5, 6 or 7, wherein the time axis is displayed horizontally or vertically in the field of view of the XR glasses for marking the time node on the time axis, thereby creating the virtual electronic tag on the time node.
9. The use method according to claim 8, characterized in that the horizontally displayed time axis is wholly or partially raised or lowered; alternatively, the vertically displayed time axis is wholly or partially shifted left or right.
10. The use method according to claim 8, wherein the time node is arranged at the lower edge, upper edge, left edge, right edge, rear edge or front edge of the time axis, and correspondingly, the virtual electronic tag is arranged at the upper edge, lower edge, right edge, left edge, front edge or rear edge of the time axis.
CN201911130859.3A 2019-11-19 2019-11-19 3D visual editor and editing method based on time axis Pending CN110956702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911130859.3A CN110956702A (en) 2019-11-19 2019-11-19 3D visual editor and editing method based on time axis


Publications (1)

Publication Number Publication Date
CN110956702A true CN110956702A (en) 2020-04-03

Family

ID=69977654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911130859.3A Pending CN110956702A (en) 2019-11-19 2019-11-19 3D visual editor and editing method based on time axis

Country Status (1)

Country Link
CN (1) CN110956702A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040130566A1 (en) * 2003-01-07 2004-07-08 Prashant Banerjee Method for producing computerized multi-media presentation
CN106162358A (en) * 2016-06-30 2016-11-23 乐视控股(北京)有限公司 A kind of VR plays control method and the equipment of video progress
US20170091977A1 (en) * 2015-09-24 2017-03-30 Unity IPR ApS Method and system for a virtual reality animation tool
CN107178870A (en) * 2017-05-04 2017-09-19 珠海格力电器股份有限公司 Multi-medium data playback equipment, air conditioning control method and device
CN107945270A (en) * 2016-10-12 2018-04-20 阿里巴巴集团控股有限公司 A kind of 3-dimensional digital sand table system
CN108446017A (en) * 2018-02-12 2018-08-24 天津大学 A kind of ancient wall disease visual analysis method based on MR glasses
CN108876687A (en) * 2018-07-20 2018-11-23 武汉虹信技术服务有限责任公司 A kind of system and method marked on the electronic map and recall community policy event
CN109324690A (en) * 2018-10-24 2019-02-12 深圳市领点科技有限公司 A kind of method for reproducing and system of the teaching resource based on Virtual Reality Platform
CN109887097A (en) * 2019-02-01 2019-06-14 河南众诚信息科技股份有限公司 A kind of VR content development platform and method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113691854A (en) * 2021-07-20 2021-11-23 阿里巴巴达摩院(杭州)科技有限公司 Video creation method and device, electronic equipment and computer program product
CN116204167A (en) * 2023-04-27 2023-06-02 杭州朗迅科技股份有限公司 Method and system for realizing full-flow visual editing Virtual Reality (VR)
CN116204167B (en) * 2023-04-27 2023-08-15 杭州朗迅科技股份有限公司 Method and system for realizing full-flow visual editing Virtual Reality (VR)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200403