CN114760418A - 360-degree full-view-angle AR monitoring method and system and AR glasses - Google Patents

360-degree full-view-angle AR monitoring method and system and AR glasses

Info

Publication number
CN114760418A
CN114760418A
Authority
CN
China
Prior art keywords
image
environment
images
abnormal
environment image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210455054.1A
Other languages
Chinese (zh)
Other versions
CN114760418B (en)
Inventor
吕新伟
王加辉
Current Assignee
Tmva Shanghai Network Technology Co ltd
Original Assignee
Tmva Shanghai Network Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Tmva Shanghai Network Technology Co ltd
Priority to CN202210455054.1A
Publication of CN114760418A
Application granted
Publication of CN114760418B
Legal status: Active
Anticipated expiration

Classifications

    • H04N23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture (H04N: pictorial communication, e.g. television)
    • G02B27/017: Head-up displays, head mounted (G02B: optical elements, systems or apparatus)
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G02B2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a 360-degree full-view-angle AR monitoring method and system, and AR glasses. The method comprises the following steps: acquiring environment images through a plurality of preset cameras, and stitching the images acquired by the cameras into a 360-degree panoramic environment image; dividing the 360-degree panoramic environment image into a preset number of sub-region images, and assigning each sub-region image a corresponding position code; identifying abnormal images in each sub-region image; generating prompt information according to the position code corresponding to the abnormal image; and generating, from the first environment image that contains the complete abnormal image among the environment images, a first AR image corresponding to that environment image, and displaying the first AR image in the user's main viewing angle. When the user observes an omnidirectional field of view, the invention reduces the influence that AR projections of other viewing angles exert on the user's main viewing angle during monitoring, improves the efficiency of confirming the direction of an abnormal image, and enables the user to act in time on surrounding abnormal conditions.

Description

360-degree full-view-angle AR monitoring method and system and AR glasses
Technical Field
The invention relates to the technical field of AR monitoring, in particular to a 360-degree full-view AR monitoring method and system and AR glasses.
Background
At present, in rescue, maintenance, monitoring and other fields that require an omnidirectional viewing angle, a user must concentrate on observing the surrounding environment through a full 360 degrees so that a corresponding judgment can be made in time when an abnormal situation arises nearby. However, because the viewing-angle range of the human eye is limited, the user cannot observe an omnidirectional field of view simultaneously.
At present, AR technology is usually adopted to project images of the other viewing angles into the user's main viewing angle, so that the user can observe every viewing angle at the same time. However, such an AR imaging method projects the images of all viewing angles at all times, which evidently causes the following problems: first, it inconveniences the user's observation of the main viewing angle; second, when the direction of an abnormal image must be confirmed, the full-view image has to be matched against the actual bearing, which reduces the efficiency of the user's response.
Therefore, a 360-degree full-view AR monitoring method is needed that suits scenes in which a user must observe an omnidirectional field of view simultaneously, reduces the influence of AR projections of other viewing angles on the user's main viewing angle during monitoring, and enables the user to act in time on surrounding abnormal situations according to the full 360-degree view.
Disclosure of Invention
In order to solve the technical problems in the prior art, namely that an AR imaging method projects the images of all viewing angles at all times, which inconveniences the user's observation of the main viewing angle and impairs the user's response, the invention provides a 360-degree full-view AR monitoring method and system, and AR glasses. The specific technical scheme is as follows:
the invention provides a 360-degree full-view AR monitoring method which is characterized by comprising the following steps:
acquiring environment images through a plurality of preset cameras, and stitching the environment images acquired by the plurality of cameras into a 360-degree panoramic environment image;
dividing the 360-degree panoramic environment image into a preset number of sub-region images, and assigning each sub-region image a corresponding position code;
identifying abnormal images in each sub-region image;
generating prompt information according to the position code corresponding to the abnormal image;
and generating, according to a first environment image that contains the complete abnormal image among the environment images, a first AR image corresponding to the first environment image, and displaying the first AR image in the main viewing angle of the user.
With the 360-degree full-view-angle AR monitoring method provided by the invention, after the environment images are collected and stitched into a 360-degree panoramic environment image, the panorama is divided into sub-region images, abnormal images in the sub-region images are identified, the AR image of the abnormal image is displayed, and the position corresponding to the abnormal image is prompted to the user. The user therefore observes the AR image of an abnormal image only when an abnormality exists, which reduces interference with the user's main viewing angle and lets the user respond in time to the AR image of the abnormal image and its corresponding position.
In some embodiments, the 360-degree full-view AR monitoring method provided by the present invention is applied to a ring-shaped wearing device, and an electric vibrator is arranged in advance on the wearing device in the direction corresponding to each sub-region image, so that the electric vibrators are in contact with the user;
generating prompt information according to the position code corresponding to the abnormal image, specifically comprising:
generating a prompt message according to the position code corresponding to the abnormal image, and then generating a vibration instruction according to the prompt message;
sending the vibration instruction to the corresponding electric vibrator;
and controlling the corresponding electric vibrator to vibrate according to the vibration instruction.
According to the 360-degree full-view-angle AR monitoring method provided by the invention, an electric vibrator in contact with the tactile nerves of the skin of the user's head is arranged on the wearing device in the direction corresponding to each sub-region image, so that the user can directly judge the direction of an abnormal situation from the vibration of the vibrator in the corresponding direction. By supplementing the display with this sensory information, the user can learn of abnormal situations in the surrounding environment without turning the head.
In some embodiments, the vibrators are arranged at 45-degree intervals;
the dividing the 360-degree panoramic environment image into a preset number of subarea images specifically comprises:
and dividing the 360-degree panoramic environment image into 8 subarea images according to a 45-degree view angle.
By setting the interval between the vibrators to 45 degrees and dividing the 360-degree panoramic environment image into 8 regions by 45-degree viewing angles, the 360-degree full-view-angle AR monitoring method lets the user accurately judge the direction of an abnormal situation from the vibration of the 8 directional vibrators, and avoids the errors in judging the direction of the abnormal image that would arise if too many vibrators were arranged.
In some embodiments, after generating the AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, the method further includes:
generating a second AR image corresponding to the prompt information according to the prompt information;
displaying the first AR image and the second AR image in a primary perspective of the user.
According to the 360-degree full-view AR monitoring method provided by the invention, the AR image of the prompt message is displayed to the user, so that the user can visually check the position code corresponding to the abnormal image, and the user can conveniently act on the abnormal image in time.
In some embodiments, after generating a first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, the method further includes:
generating a third AR image corresponding to the second environment image according to the second environment image which comprises the complete main visual angle of the user in the environment image;
displaying the first AR image and the third AR image in a primary perspective of the user.
The 360-degree full-view AR monitoring method provided by the invention generates a corresponding AR image from the environment image shot by the camera that covers the user's main viewing angle and displays that AR image within the main viewing angle, which suits scenes in which the user cannot conveniently observe the main viewing angle directly.
In some embodiments, after generating the prompt information according to the position code corresponding to the abnormal image, the method further includes:
generating a third environment image with a preset size by taking the abnormal image as a center in the 360-degree panoramic environment image;
and generating a fourth AR image corresponding to the third environment image, and displaying the fourth AR image in the main view angle of the user.
After the abnormal image is identified, the 360-degree full-view AR monitoring method provided by the invention generates and displays a third environment image of a preset size centered on the abnormal image, so that the user can observe the abnormal situation more intuitively in the main viewing angle.
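As a rough sketch of how such a preset-size window centered on the abnormal image could be cut out of the panorama, the following Python fragment models the panorama as a list of pixel columns and handles the wrap-around at the 0/360-degree seam. All names and the column-based representation are illustrative assumptions, not taken from the patent.

```python
def crop_centered(panorama_columns, center_index, window_size):
    """Cut a window of `window_size` columns centered on `center_index`,
    wrapping around the 0/360-degree seam of the panorama."""
    n = len(panorama_columns)
    half = window_size // 2
    # Modular indexing so a window near the seam pulls columns
    # from both ends of the panorama.
    return [panorama_columns[(center_index - half + i) % n]
            for i in range(window_size)]

# A toy 12-column panorama labelled 0..11.
pano = list(range(12))
# A 5-column window centered on column 1 wraps past the seam.
window = crop_centered(pano, 1, 5)
```

The modular index is the one detail a naive crop would get wrong: an abnormality near the stitch seam still needs a contiguous window.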
In some embodiments, the present invention further provides a 360-degree full-view AR monitoring system, comprising:
the acquisition module is used for acquiring an environment image through a plurality of preset cameras and splicing the environment image acquired by the plurality of cameras into a 360-degree panoramic environment image;
the segmentation module is connected with the acquisition module and used for segmenting the 360-degree panoramic environment image into a preset number of subarea images and endowing each subarea image with a corresponding position code;
the identification module is connected with the segmentation module and used for identifying abnormal images in the subarea images;
the first generation module is respectively connected with the segmentation module and the identification module and is used for generating prompt information according to the position code corresponding to the abnormal image;
and the second generation module is respectively connected with the identification module and the acquisition module and is used for generating a first AR image corresponding to the first environment image according to the first environment image containing the complete abnormal image in the environment image and displaying the first AR image in the main visual angle of the user.
In some embodiments, the present invention also provides AR glasses comprising:
the cameras are used for acquiring environment images;
the processor is connected with the cameras and used for splicing the environment images collected by the cameras into 360-degree panoramic environment images, dividing the 360-degree panoramic environment images into a preset number of subarea images, giving a position code corresponding to each subarea image, identifying abnormal images in each subarea image, generating prompt information according to the position codes corresponding to the abnormal images, and generating a first AR image corresponding to the first environment image according to the first environment image which contains the complete abnormal images in the environment image;
display optics, coupled to the processor, for displaying an AR image of the anomalous image in a primary perspective of a user.
In some embodiments, the wearing device of the AR glasses is configured as a ring, and the electric vibrators are respectively disposed on the wearing device in the corresponding directions of the images of the sub-regions, and are in contact with the user;
the processor generates a vibration instruction according to the prompt information, sends the vibration instruction to the corresponding electric vibrator, and controls the corresponding electric vibrator to vibrate according to the vibration instruction.
The AR glasses provided by the invention have the advantages that the plurality of electric vibrators are arranged, and the effect that a user can acquire abnormal conditions in the surrounding environment without turning around is realized by combining sensory information.
In some embodiments, the processor is independently disposed outside the AR glasses, and is respectively in communication connection with the display lens, the plurality of cameras, and each of the electric vibrators.
The AR glasses provided by the invention are independently provided with the processor, so that the functional requirements of the AR glasses are met and the size of the AR glasses is reduced under the condition that the processor is in communication connection with the display lens, the plurality of cameras and each electric vibrator.
The invention provides a 360-degree full-view AR monitoring method and system and AR glasses, which at least comprise the following technical effects:
(1) after an environment image is collected and spliced into a 360-degree panoramic environment image, the 360-degree panoramic environment image is divided into all subarea images, abnormal images in the subarea images are identified, AR images of the abnormal images are displayed, and positions corresponding to the abnormal images are prompted to a user, so that the user can observe the AR images of the abnormal images only when the abnormal images exist, the interference to a main visual angle of the user is reduced, and meanwhile, the user can timely respond to the AR images of the abnormal images and the positions corresponding to the abnormal images;
(2) the electric vibrators in contact with the skin touch nerves of the head of a user are arranged in the corresponding directions of the images of all the subareas on the wearing equipment, so that the user can directly judge the direction of the abnormal condition according to the vibration of the electric vibrators in the corresponding directions, and the effect that the abnormal condition in the surrounding environment can be obtained without turning the head of the user is realized by combining sensory information;
(3) by setting the interval between each vibrator to be 45 degrees and dividing the 360-degree panoramic environment image into 8 areas according to the 45-degree view angle, a user can conveniently and accurately judge the direction of an abnormal situation according to the vibration information of the 8 directional vibrators, and the situation that the user judges the direction of the abnormal image to have errors due to the fact that the number of the vibrators is too large is avoided;
(4) by displaying the AR image of the prompt message to the user, the user can visually check the position code corresponding to the abnormal image, so that the user can act on the abnormal image in time;
(5) generating a corresponding AR picture according to an environment picture shot by a camera for collecting a main visual angle of a user, and displaying the AR picture of the main visual angle of the user in the main visual angle of the user, so that the method is suitable for a scene where the user cannot conveniently and directly observe the main visual angle;
(6) after the abnormal image is identified, a third environment image with a preset size is generated and displayed by taking the abnormal image as a center, so that a user can more intuitively observe the abnormal image in the main view angle.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the invention, and those skilled in the art can obtain other drawings from them without inventive labor.
FIG. 1 is a flow chart of a 360-degree full-view AR monitoring method according to the present invention;
FIG. 2 is a flowchart of generating a prompt message in a 360-degree full-view AR monitoring method according to the present invention;
FIG. 3 is a flowchart of displaying an AR image corresponding to a prompt message in a 360-degree full-view AR monitoring method according to the present invention;
FIG. 4 is a flowchart illustrating displaying a main view AR image of a user in a 360-degree full view AR monitoring method according to the present invention;
FIG. 5 is another flow chart of a 360 degree full view AR monitoring method of the present invention;
FIG. 6 is an exemplary diagram of a 360 degree full view AR monitoring system of the present invention;
FIG. 7 is an exemplary diagram of AR glasses of the present invention;
fig. 8 is another example of AR glasses according to the present invention.
Reference numerals in the figures: acquisition module 10; segmentation module 20; identification module 30; first generation module 40; second generation module 50; AR glasses 100; processor 110; display lens 120; camera 130; electric vibrator 140.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically depicted, or only one of them is labeled. In this document, "one" means not only "only one" but also a case of "more than one".
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
An embodiment of the present invention, as shown in fig. 1, provides a 360-degree full-view AR monitoring method, including the steps of:
s100, acquiring an environment image through a plurality of preset cameras, and splicing the environment image acquired by the plurality of cameras into a 360-degree panoramic environment image.
Specifically, the cameras are arranged so that the environment images shot by the plurality of cameras together cover the full 360-degree panorama, and the images shot by different cameras may have overlapping parts. After the environment images are collected, the 360-degree panoramic environment image is synthesized by an image-stitching algorithm; the algorithm may be implemented with an existing image-synthesis model, which is not described further here to avoid repeating known technical features. The cameras collect environment images in real time and the images are synthesized into the 360-degree panorama before subsequent processing is executed, which ensures the real-time validity of the panoramic image.
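Since the patent defers to an existing image-synthesis model, the following is only a toy stand-in for the stitching step: it concatenates pre-aligned camera strips with a known fixed overlap, in place of the feature-based registration and blending a real stitcher (such as OpenCV's Stitcher) would perform. All names and the strip-of-columns representation are illustrative.

```python
def stitch_panorama(camera_images, overlap):
    """Concatenate horizontally adjacent camera images into one
    panoramic strip, dropping the `overlap` columns that each image
    shares with its left neighbour (a crude stand-in for real
    feature-based registration and blending)."""
    panorama = list(camera_images[0])
    for image in camera_images[1:]:
        panorama.extend(image[overlap:])
    # The last image also overlaps the first across the 360-degree seam.
    return panorama[:len(panorama) - overlap]

# Four toy "images", each a list of column labels, overlapping by 1 column.
cams = [[0, 1, 2, 3], [3, 4, 5, 6], [6, 7, 8, 9], [9, 10, 11, 0]]
pano = stitch_panorama(cams, overlap=1)
```

The seam trim at the end mirrors the fact that in a closed 360-degree rig the last camera overlaps the first as well.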
S200, the 360-degree panoramic environment image is divided into a preset number of subarea images, and a position code corresponding to each subarea image is given.
Specifically, in the region-segmentation process the 360-degree panoramic environment image is divided into a plurality of sub-regions according to a preset viewing angle. For example, when divided by a 90-degree viewing angle, the panorama is split into four regions, and each of the first, second, third and fourth regions is given a code for its corresponding position. The format of the position code is not limited; any encoding that distinguishes a region from the others may be used.
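The division into coded sub-regions can be sketched as follows. Because the patent leaves the encoding format open, the `R###` code (region's starting angle, zero-padded) and the function name are assumptions made for illustration only.

```python
def split_into_subregions(panorama_width, view_angle_deg):
    """Divide a 360-degree panorama (given as a pixel width) into
    sub-regions of `view_angle_deg` each, returning for every region
    its position code and its [start, end) column range."""
    count = 360 // view_angle_deg
    cols_per_region = panorama_width // count
    return [
        {"code": f"R{i * view_angle_deg:03d}",  # e.g. R090 = region starting at 90 degrees
         "start": i * cols_per_region,
         "end": (i + 1) * cols_per_region}
        for i in range(count)
    ]

# The 90-degree example from the text: four regions.
regions = split_into_subregions(panorama_width=3600, view_angle_deg=90)
```

With a 45-degree viewing angle the same function yields the 8-region split used in the vibrator embodiment.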
S300 identifies an abnormal image in each of the subarea images.
Specifically, an image-recognition algorithm may be trained in advance on labelled region images so that it can recognize whether target feature information is present in an image. The algorithm may adopt a conventional image-recognition model; since this is prior art, it is not described in detail here. The type of feature information can be replaced according to the usage scene, for example obstacles, moving targets, targets that are too close, or faulty objects under inspection.
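The patent relies on a trained recognition model for this step, so the following is only a crude stand-in that flags sub-regions whose mean pixel value deviates from a calibrated baseline. The thresholding logic and all names are illustrative assumptions, not the patent's method.

```python
def find_abnormal_regions(region_pixels, baseline_mean, threshold):
    """Return indices of sub-regions whose mean pixel value deviates
    from a calibrated baseline by more than `threshold` (a crude
    stand-in for the trained recognition model described in the text)."""
    abnormal = []
    for i, pixels in enumerate(region_pixels):
        mean = sum(pixels) / len(pixels)
        if abs(mean - baseline_mean) > threshold:
            abnormal.append(i)
    return abnormal

# Region 2 is much brighter than the calibrated background.
regions = [[10, 12, 11], [9, 10, 11], [200, 210, 190], [10, 9, 12]]
flagged = find_abnormal_regions(regions, baseline_mean=10.5, threshold=50)
```

In practice a per-region classifier or detector would replace the brightness test, but the per-region iteration and index output match the flow described above.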
S410, prompt information is generated according to the position codes corresponding to the abnormal images.
Specifically, the prompt information includes visual, auditory, tactile and other sensory information. For example, the prompt information may be an audio prompt: a corresponding voice prompt is generated according to the position code of the abnormal image, reminding the user that an abnormality has occurred at that position.
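Generating the prompt from a position code reduces to a lookup plus message formatting, as in this sketch. The `code -> bearing` mapping and the message wording are illustrative assumptions; the patent does not fix either.

```python
def make_prompt(position_code, code_to_bearing):
    """Build a human-readable prompt from an abnormal region's position
    code, using a lookup of code -> compass bearing (mapping and
    wording are illustrative, not specified by the patent)."""
    bearing = code_to_bearing[position_code]
    return f"Abnormality detected to the {bearing} (region {position_code})"

bearings = {"R000": "front", "R090": "right", "R180": "rear", "R270": "left"}
prompt = make_prompt("R090", bearings)
```

The same string could feed a text-to-speech engine for the audio-prompt variant mentioned above.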
S420, according to a first environment image containing the complete abnormal image in the environment image, a first AR image corresponding to the first environment image is generated, and the first AR image is displayed in the main visual angle of the user.
Illustratively, the environment images in the front, rear, left and right directions are acquired by 4 cameras respectively. After the sub-region images are divided, if an abnormal image exists in the environment image collected by the left camera, that image is taken as the first environment image, and the AR image corresponding to it is generated and displayed.
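Selecting the "first environment image" amounts to finding the camera whose field of view fully contains the abnormal image. The following sketch does this on column ranges; the containment rule and the `None` fallback for an anomaly straddling two cameras are illustrative assumptions.

```python
def camera_containing(anomaly_start, anomaly_end, camera_ranges):
    """Return the index of the first camera whose [start, end) column
    range fully contains the abnormal image, or None if the anomaly
    straddles two cameras (illustrative selection logic)."""
    for i, (start, end) in enumerate(camera_ranges):
        if start <= anomaly_start and anomaly_end <= end:
            return i
    return None

# Four cameras covering front/right/rear/left quarters of a 3600-column panorama.
cams = [(0, 900), (900, 1800), (1800, 2700), (2700, 3600)]
idx = camera_containing(2750, 2900, cams)
```

Requiring full containment matches the text's emphasis on the environment image that includes the *complete* abnormal image.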
Specifically, the execution sequence of step S410 and step S420 does not affect the technical effect of the present application, and S410 may be executed first and then S420 is executed, or S420 and then S410 is executed, as shown in fig. 1, which is a schematic diagram of S410 and S420 being executed simultaneously.
With the 360-degree full-view-angle AR monitoring method provided by this embodiment, the environment images are collected and stitched into a 360-degree panoramic environment image, the panorama is divided into sub-region images, abnormal images in the sub-region images are identified, the AR image of the abnormal image is displayed, and the position corresponding to the abnormal image is prompted to the user. The user therefore observes the AR image of an abnormal image only when an abnormality exists, which reduces interference with the user's main viewing angle and lets the user respond in time to the AR image of the abnormal image and its corresponding position.
In one embodiment, as shown in fig. 2, the step S410 of generating the prompt information according to the position code corresponding to the abnormal image specifically includes:
s411, after generating the prompt information according to the position code corresponding to the abnormal image, generating a vibration command according to the prompt information.
Specifically, the 360-degree full-view-angle AR monitoring method provided by this embodiment is applied to a ring-shaped wearing device, and an electric vibrator is arranged in advance on the wearing device at the position corresponding to each sub-region image, so that the vibrators are in contact with the user. For example, after division by a 1-degree viewing angle, the 360-degree panoramic environment image is split into 360 sub-region images, and one electric vibrator is arranged at each of the 360 corresponding positions on the wearing device.
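The mapping from an abnormal sub-region to the vibrator worn at the matching bearing on the ring can be sketched as below. The one-vibrator-per-region layout follows the text; placing each vibrator at the center angle of its region, and the returned dictionary shape, are illustrative assumptions.

```python
def vibrator_for_region(region_index, region_count):
    """Map an abnormal sub-region to the vibrator worn at the matching
    bearing on the ring-shaped device: one vibrator per region,
    positioned at the center angle of that region (illustrative)."""
    degrees_per_region = 360 / region_count
    center_angle = region_index * degrees_per_region + degrees_per_region / 2
    return {"vibrator": region_index, "angle_deg": center_angle}

# With the 8-region, 45-degree division used in another embodiment,
# abnormal region 2 maps to the vibrator at 112.5 degrees.
cmd = vibrator_for_region(region_index=2, region_count=8)
```

The same function covers the fine-grained layout above by passing a larger `region_count`.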
S412 sends the vibration command to the corresponding electrodynamic vibrator.
S413 controls the corresponding electrodynamic vibrator to vibrate according to the vibration command.
According to the 360-degree full-view-angle AR monitoring method provided by this embodiment, an electric vibrator in contact with the tactile nerves of the skin of the user's head is arranged on the wearing device in the direction corresponding to each sub-region image, so that the user can directly judge the direction of an abnormal situation from the vibration of the vibrator in the corresponding direction. By supplementing the display with this sensory information, the user can learn of abnormal situations in the surrounding environment without turning the head.
In one embodiment, the vibrators are arranged at intervals of 45 degrees, and the region segmentation of the 360-degree panoramic environment image in step S200 specifically comprises: dividing the panorama into 8 regions by 45-degree viewing angles, with one vibrator arranged for each of the 8 regions.
By setting the interval between the vibrators to 45 degrees and dividing the 360-degree panoramic environment image into 8 regions by 45-degree viewing angles, the method of this embodiment lets the user accurately judge the direction of an abnormal situation from the vibration of the 8 directional vibrators, and avoids the errors in judging the direction of the abnormal image that would arise if too many vibrators were arranged.
In one embodiment, as shown in fig. 3, step S420 generates a first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, and displays the first AR image in the main viewing angle of the user, specifically including:
S421, according to a first environment image including a complete abnormal image in the environment image, a first AR image corresponding to the first environment image is generated.
S422, according to the prompt message, generating a second AR image corresponding to the prompt message.
S423 displays the first AR image and the second AR image in the main viewing angle of the user.
Specifically, in the present embodiment, step S410 is executed first, and then steps S421 to S423 are executed.
According to the 360-degree full-view AR monitoring method provided by this embodiment, the AR image of the prompt information is displayed to the user, so that the user can visually check the position code corresponding to the abnormal image and act on the abnormality in time.
In one embodiment, as shown in fig. 4, in step S420, according to a first environment image including a complete abnormal image in the environment image, a first AR image corresponding to the first environment image is generated, and the first AR image is displayed in a main viewing angle of a user, which specifically includes:
S421, according to the first environment image including the complete abnormal image in the environment image, a first AR image corresponding to the first environment image is generated.
S424 generates a third AR image corresponding to the second environment image according to the second environment image including the main view of the complete user in the environment image.
S425 displays the first AR image and the third AR image in a main viewing angle of the user.
Specifically, when step S425 is executed, the second AR image corresponding to the prompt information may be displayed at the same time.
The 360-degree full-view AR monitoring method provided by the invention generates a corresponding AR image from the environment image captured by the camera covering the user's main viewing angle and displays that AR image within the main viewing angle, which suits scenes where the user cannot conveniently observe the main viewing angle directly.
In one embodiment, as shown in fig. 5, after the step S410 generates the prompt information according to the position code corresponding to the abnormal image, the method further includes the steps of:
S426 generates a third environment image of a preset size centering on the abnormal image in the 360-degree panoramic environment image.
Illustratively, an environment image with a 90-degree field-of-view angle, centered on the abnormal image, is cropped from the 360-degree panoramic environment image as the third environment image.
S427 generates a fourth AR image corresponding to the third environment image and displays the fourth AR image in the main viewing angle of the user.
In the 360-degree full-view AR monitoring method provided by the invention, after the abnormal image is identified, a third environment image of preset size centered on the abnormal image is generated and the corresponding fourth AR image is displayed, so that the user can observe the abnormal situation more intuitively within the main viewing angle.
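Cropping a fixed-angle window centered on the abnormal image has one subtlety: the window may straddle the panorama's seam, so the column indices must wrap around. A minimal sketch under that assumption (function name and signature are illustrative):

```python
import numpy as np

def crop_around(pano, center_col, fov_deg=90):
    """Crop a fov_deg-wide window centred on column center_col of a
    360-degree panorama, wrapping around the image seam (step S426)."""
    h, w = pano.shape[:2]
    half = int(round(w * fov_deg / 360)) // 2
    cols = [(center_col + d) % w for d in range(-half, half)]
    return pano[:, cols]
```

With a 360-column panorama, a 90-degree crop is 90 columns wide regardless of whether the center sits near the seam.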
In one embodiment, the AR image of the abnormal image is displayed in a preset abnormal-image display area within the user's main viewing angle. This further reduces the influence on the main viewing angle: the user observes the AR image of the abnormal image only in the preset area, avoiding the inconvenience of confusing the main-viewing-angle picture with the AR image of the abnormal image.
In one embodiment, as shown in fig. 6, the present invention further provides a 360-degree full-view AR monitoring system, which includes an acquisition module 10, a segmentation module 20, an identification module 30, a first generation module 40, and a second generation module 50.
The acquisition module 10 is configured to acquire an environment image through a plurality of preset cameras, and splice the environment image acquired by the plurality of cameras into a 360-degree panoramic environment image.
Specifically, the cameras are arranged such that their combined environment images cover the full 360-degree panorama, and the images captured by different cameras may have overlapping parts. After the environment images are collected, the 360-degree panoramic environment image is synthesized by an image stitching algorithm, which may be implemented with an existing image synthesis model and is therefore not described in detail here to avoid repeating known technical features. The cameras collect environment images in real time, the panorama is synthesized continuously and subsequent processing is executed, guaranteeing the real-time performance of the 360-degree panoramic environment image.
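As a toy illustration of the stitching step only: if the column overlap between neighbouring cameras were known and fixed, the panorama could be assembled by trimming that overlap and concatenating. Real stitching models estimate the overlap by feature matching and blend the seams, so this sketch stands in for the "existing image synthesis model" the patent defers to:

```python
import numpy as np

def stitch_horizontal(frames, overlap):
    """Naive left-to-right stitch of same-height camera frames with a
    known fixed column overlap between neighbours; a stand-in for a
    feature-matching stitcher, not a substitute for one."""
    pano = frames[0]
    for f in frames[1:]:
        pano = np.concatenate([pano, f[:, overlap:]], axis=1)
    return pano

# Four 100-column frames with a 10-column overlap give a 370-column strip.
frames = [np.zeros((64, 100, 3), dtype=np.uint8) for _ in range(4)]
pano = stitch_horizontal(frames, overlap=10)
```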
The segmentation module 20 is connected to the acquisition module 10, and is configured to divide the 360-degree panoramic environment image into a preset number of sub-region images and assign a position code corresponding to each sub-region image.
Specifically, in the process of performing region segmentation, the 360-degree panoramic environment image is segmented into a plurality of sub-regions according to a preset field-of-view angle. For example, when segmented by a 90-degree field-of-view angle, the panorama is divided into four regions, and each of the first, second, third and fourth regions is given its corresponding position code. The format of the position code is not limited; any encoding that distinguishes a region from the other regions may be adopted.
The identification module 30 is connected to the segmentation module 20 and is configured to identify an abnormal image in each of the sub-region images.
Specifically, the recognition module 30 may embed an image recognition algorithm trained in advance on labeled region images, so that it can recognize whether target feature information is present in an image. The algorithm may use a conventional image recognition model, which, being prior art, is not described in detail here. The type of feature information can be changed for different usage scenarios, such as an obstacle, a moving target, an object that is too close, or a device under test with a fault.
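Since the patent leaves the recognition model unspecified, here is only a toy stand-in showing the module's contract: a sub-region image goes in, an abnormal/normal decision comes out. Frame differencing against a reference background, with assumed `threshold` and `min_changed_frac` parameters, is one of the simplest such detectors and is not what a trained model would do:

```python
import numpy as np

def is_abnormal(sub_region, reference, threshold=30, min_changed_frac=0.01):
    """Toy anomaly test by per-pixel differencing against a reference
    frame; stands in for the trained image-recognition algorithm of
    recognition module 30, whose model the patent does not fix."""
    diff = np.abs(sub_region.astype(np.int16) - reference.astype(np.int16))
    changed = (diff.mean(axis=-1) > threshold).mean()
    return changed > min_changed_frac
```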
The first generating module 40 is connected to the segmenting module 20 and the identifying module 30, respectively, and is configured to generate the prompt information according to the position code corresponding to the abnormal image.
Specifically, the prompt information includes visual, auditory, tactile and other sensory information. For example, the prompt information may be an audio prompt: a corresponding voice prompt is generated according to the position code of the abnormal image, reminding the user that an abnormality has occurred at that position.
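For the audio-prompt case, the first generation module 40 essentially renders a position code as text for a speech synthesiser. A minimal sketch; the `"Rk"` code format and the direction-name table are illustrative assumptions, since the patent fixes neither:

```python
DIRECTION_NAMES = {  # hypothetical mapping for an 8-region, 45-degree split
    "R0": "front", "R1": "front-right", "R2": "right", "R3": "rear-right",
    "R4": "rear", "R5": "rear-left", "R6": "left", "R7": "front-left",
}

def make_prompt(position_code):
    """Turn an abnormal region's position code into the text that would
    be fed to a speech synthesiser as the voice prompt."""
    direction = DIRECTION_NAMES.get(position_code, "unknown direction")
    return f"Abnormality detected at {position_code} ({direction})"
```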
The second generating module 50 is connected to the identifying module 30 and the collecting module 10, and configured to generate a first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, and display the first AR image in the main viewing angle of the user.
Illustratively, environment images in the front, rear, left and right directions are acquired by 4 cameras respectively. After the sub-region images are divided, if an abnormal image exists in the environment image acquired by the left camera, that image is taken as the first environment image, and the AR image corresponding to it is generated and displayed.
The 360-degree full-view-angle AR monitoring system provided by this embodiment collects environment images, stitches them into a 360-degree panoramic environment image, divides the panorama into sub-region images, and identifies abnormal images in the sub-regions. It displays the AR image of an abnormal image and prompts the user with the corresponding position, so that the user observes the AR image only when an abnormality exists. This reduces interference with the user's main viewing angle while letting the user respond in time to the abnormal image and its position.
In one embodiment, as shown in fig. 7, the present invention further provides AR glasses 100 comprising a processor 110, a display lens 120 and several cameras 130.
Wherein, the plurality of cameras 130 are used for collecting environment images.
The processor 110 is connected to the plurality of cameras 130, and is configured to splice environment images collected by the plurality of cameras 130 into a 360-degree panoramic environment image, divide the 360-degree panoramic environment image into a preset number of sub-region images, assign a position code corresponding to each sub-region image, identify an abnormal image in each sub-region image, generate prompt information according to the position code corresponding to the abnormal image, and generate a first AR image corresponding to the first environment image according to a first environment image including a complete abnormal image in the environment image.
The display lens 120 is coupled to the processor 110 for displaying an AR image of the abnormal image in the user's main viewing angle.
In particular, the AR glasses 100 may include, but are not limited to, a processor 110, a display lens 120, and a number of cameras 130. Those skilled in the art will appreciate that fig. 7 is merely an example of the AR glasses 100, does not constitute a limitation of the AR glasses 100, and that they may include more or fewer components than shown, combine some components, or use different components. For example, the AR glasses 100 may also include input/output interfaces, display devices, network access devices, communication buses, communication interfaces, and the like, wherein the processor 110, the display lens 120, the plurality of cameras 130, the input/output interface and the communication interface communicate with one another through the communication bus.
In one embodiment, as shown in fig. 8, the wearing device of the AR glasses 100 is configured as a ring, and an electric vibrator 140 is disposed at the position corresponding to each sub-region image on the wearing device, each electric vibrator 140 being in contact with the user. The processor 110 generates a vibration instruction according to the prompt information, sends the vibration instruction to the corresponding electric vibrator 140, and controls that electric vibrator 140 to vibrate according to the vibration instruction.
The processor 110 is independently disposed outside the AR glasses 100, and the processor 110 is respectively connected to the display lens 120, the plurality of cameras 130, and the electric vibrators 140 in a communication manner.
Specifically, fig. 8 is an exemplary view of AR glasses, in which 4 cameras 130 and 8 electrodynamic vibrators 140 are provided, and the number is merely an example and does not have a limiting effect.
In the AR glasses provided by this embodiment, the processor is arranged independently and communicatively connected to the display lens, the plurality of cameras and each electric vibrator. This meets the functional requirements of the AR glasses while reducing their volume, and by arranging a plurality of electric vibrators and combining sensory information, the user can learn of abnormal situations in the surrounding environment without turning the head.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or recited in detail in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed 360-degree full-view AR monitoring method, system and AR glasses may be implemented in other ways. For example, the above-described embodiments of a 360-degree full-view AR monitoring method, system and AR glasses are merely illustrative, and for example, the division of the modules or units is only a logical division, and there may be other divisions when the actual implementation is performed, for example, a plurality of units or modules may be combined or may be integrated into another system, or some features may be omitted or not executed. In addition, the communication links shown or discussed may be through interfaces, devices or units, or integrated circuits, and may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that the above-mentioned embodiments are only preferred embodiments of the present invention, and that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention; these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A360-degree full-view AR monitoring method is characterized by comprising the following steps:
acquiring an environment image through a plurality of preset cameras, and splicing the environment image acquired by the plurality of cameras into a 360-degree panoramic environment image;
dividing the 360-degree panoramic environment image into a preset number of subarea images, and giving a position code corresponding to each subarea image;
identifying abnormal images in each subregion image;
generating prompt information according to the position code corresponding to the abnormal image;
and generating a first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, and displaying the first AR image in a main visual angle of a user.
2. The 360-degree full-view AR monitoring method according to claim 1, wherein
the method is applied to annular wearing equipment, and electric vibrators are arranged in corresponding directions of all the subarea images on the wearing equipment in advance to enable the electric vibrators to be in contact with a user;
generating prompt information according to the position code corresponding to the abnormal image, specifically comprising:
generating prompt information according to the position code corresponding to the abnormal image, and then generating a vibration instruction according to the prompt information;
sending the vibration instruction to the corresponding electric vibrator;
and controlling the corresponding electric vibrator to vibrate according to the vibration instruction.
3. The 360-degree full-view AR monitoring method according to claim 2, wherein
the electric vibrators are arranged at intervals of 45 degrees;
the dividing the 360-degree panoramic environment image into a preset number of subarea images specifically comprises:
and dividing the 360-degree panoramic environment image into 8 sub-area images according to a 45-degree view field angle.
4. The 360-degree full-view AR monitoring method according to claim 1, wherein after generating the first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, the method further comprises:
generating a second AR image corresponding to the prompt information according to the prompt information;
displaying the first AR image and the second AR image in a primary perspective of the user.
5. The 360-degree full-view AR monitoring method according to claim 1, wherein after generating a first AR image corresponding to the first environment image according to the first environment image including the complete abnormal image in the environment image, the method further comprises:
generating a third AR image corresponding to the second environment image according to the second environment image which contains the complete main view angle of the user in the environment image;
displaying the first AR image and the third AR image in a primary perspective of the user.
6. The 360-degree full-view AR monitoring method according to claim 1, wherein after generating the prompt information according to the position code corresponding to the abnormal image, the method further comprises:
generating a third environment image with a preset size by taking the abnormal image as a center in the 360-degree panoramic environment image;
and generating a fourth AR image corresponding to the third environment image, and displaying the fourth AR image in the main view angle of the user.
7. A360 degree full view AR monitoring system, comprising:
the acquisition module is used for acquiring an environment image through a plurality of preset cameras and splicing the environment image acquired by the cameras into a 360-degree panoramic environment image;
the segmentation module is connected with the acquisition module and used for segmenting the 360-degree panoramic environment image into a preset number of subarea images and endowing each subarea image with a corresponding position code;
the identification module is connected with the segmentation module and used for identifying abnormal images in the subarea images;
the first generation module is respectively connected with the segmentation module and the identification module and is used for generating prompt information according to the position code corresponding to the abnormal image;
and the second generation module is respectively connected with the identification module and the acquisition module and is used for generating a first AR image corresponding to the first environment image according to the first environment image containing the complete abnormal image in the environment image and displaying the first AR image in the main visual angle of the user.
8. AR glasses, comprising:
the cameras are used for acquiring environment images;
the processor is connected with the cameras and used for splicing the environment images collected by the cameras into 360-degree panoramic environment images, dividing the 360-degree panoramic environment images into a preset number of subarea images, giving a position code corresponding to each subarea image, identifying abnormal images in each subarea image, generating prompt information according to the position codes corresponding to the abnormal images, and generating a first AR image corresponding to the first environment image according to the first environment image which contains the complete abnormal images in the environment image;
a display lens, coupled to the processor, for displaying an AR image of the abnormal image in a main viewing angle of a user.
9. The AR glasses according to claim 8,
the wearing equipment of the AR glasses is arranged in a ring shape, electric vibrators are arranged in the corresponding directions of the sub-area images on the wearing equipment respectively, and the electric vibrators are in contact with the user;
the processor generates a vibration instruction according to the prompt information, sends the vibration instruction to the corresponding electric vibrator, and controls the corresponding electric vibrator to vibrate according to the vibration instruction.
10. The AR glasses according to claim 9,
the processor is independently arranged outside the AR glasses and is respectively in communication connection with the display lenses, the plurality of cameras and the electric vibrators.
CN202210455054.1A 2022-04-24 2022-04-24 360-Degree full-view AR monitoring method and system and AR glasses Active CN114760418B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210455054.1A CN114760418B (en) 2022-04-24 2022-04-24 360-Degree full-view AR monitoring method and system and AR glasses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210455054.1A CN114760418B (en) 2022-04-24 2022-04-24 360-Degree full-view AR monitoring method and system and AR glasses

Publications (2)

Publication Number Publication Date
CN114760418A true CN114760418A (en) 2022-07-15
CN114760418B CN114760418B (en) 2024-06-18

Family

ID=82333335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210455054.1A Active CN114760418B (en) 2022-04-24 2022-04-24 360-Degree full-view AR monitoring method and system and AR glasses

Country Status (1)

Country Link
CN (1) CN114760418B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114937247A (en) * 2022-07-21 2022-08-23 四川金信石信息技术有限公司 Transformer substation monitoring method and system based on deep learning and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018067773A (en) * 2016-10-18 2018-04-26 キヤノン株式会社 Imaging device, control method thereof, program, and storage medium
US20180143433A1 (en) * 2016-11-18 2018-05-24 Seiko Epson Corporation Head mounted display, control method thereof, and computer program
CN108293108A (en) * 2015-11-27 2018-07-17 三星电子株式会社 Electronic device for showing and generating panoramic picture and method
CN110197569A (en) * 2018-11-19 2019-09-03 广东小天才科技有限公司 Safety monitoring method based on wearable device and wearable device
US20200084516A1 (en) * 2017-04-03 2020-03-12 Electronics And Telecommunications Research Institute Device and method for processing high-definition 360-degree vr image
CN211149677U (en) * 2020-02-07 2020-07-31 中国人民解放军空军特色医学中心 Head-wearing type double-channel direction reminding device for listening and touching

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108293108A (en) * 2015-11-27 2018-07-17 三星电子株式会社 Electronic device for showing and generating panoramic picture and method
JP2018067773A (en) * 2016-10-18 2018-04-26 キヤノン株式会社 Imaging device, control method thereof, program, and storage medium
US20180143433A1 (en) * 2016-11-18 2018-05-24 Seiko Epson Corporation Head mounted display, control method thereof, and computer program
US20200084516A1 (en) * 2017-04-03 2020-03-12 Electronics And Telecommunications Research Institute Device and method for processing high-definition 360-degree vr image
CN110197569A (en) * 2018-11-19 2019-09-03 广东小天才科技有限公司 Safety monitoring method based on wearable device and wearable device
CN211149677U (en) * 2020-02-07 2020-07-31 中国人民解放军空军特色医学中心 Head-wearing type double-channel direction reminding device for listening and touching


Also Published As

Publication number Publication date
CN114760418B (en) 2024-06-18

Similar Documents

Publication Publication Date Title
Fisher et al. Virtual interface environment workstations
JP6904254B2 (en) Surgical controls, surgical controls, and programs
JP2011205358A (en) Head-mounted display device
CN106325511A (en) Virtual reality realizing system
CN101743567A (en) Virtual interactive presence systems and methods
CN108259883B (en) Image processing method, head-mounted display, and readable storage medium
CN114760418B (en) 360-Degree full-view AR monitoring method and system and AR glasses
US7377650B2 (en) Projection of synthetic information
JP5963006B2 (en) Image conversion apparatus, camera, video system, image conversion method, and recording medium recording program
JP2020184292A (en) Dispersion-type target tracking system
US20230239457A1 (en) System and method for corrected video-see-through for head mounted displays
KR101977635B1 (en) Multi-camera based aerial-view 360-degree video stitching and object detection method and device
CN108702482A (en) Information processing equipment, information processing system, information processing method and program
JP2010217984A (en) Image detector and image detection method
CN115836246A (en) Multi-pinhole camera and image recognition system
JP2017046233A (en) Display device, information processor, and control method of the same
CN110267017A (en) A kind of system and method managed and show camera video
JP2741063B2 (en) Wide-field display device
CN111722401A (en) Display device, display control method, and display system
KR102357262B1 (en) Communication apparatus and method of video call therefor
CN114882742A (en) Ear endoscope operation simulation teaching method, system, equipment and medium based on VR technology
JPH03226198A (en) Stereoscopic picture display device
EP4343701A1 (en) Image generation method, apparatus and system, and computer-readable storage medium
JPWO2017098999A1 (en) Information processing apparatus, information processing system, information processing apparatus control method, and computer program
US20240299122A1 (en) Surgical imaging system with selective view

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant