CN114415828A - Method and device for remotely checking vehicle based on augmented reality - Google Patents

Method and device for remotely checking vehicle based on augmented reality

Info

Publication number
CN114415828A
CN114415828A CN202111620206.0A CN202111620206A
Authority
CN
China
Prior art keywords
data stream
augmented reality
action
vehicle
reality data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111620206.0A
Other languages
Chinese (zh)
Inventor
彭飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 58 Information Technology Co Ltd
Original Assignee
Beijing 58 Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 58 Information Technology Co Ltd filed Critical Beijing 58 Information Technology Co Ltd
Priority to CN202111620206.0A
Publication of CN114415828A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides a method and a device for remotely checking a vehicle based on augmented reality, which relate to the field of the Internet. The method comprises the following steps: a field device collects a world environment data stream, generates a first augmented reality data stream based on the world environment data stream, and sends the first augmented reality data stream to a remote device, so that the remote device generates a user cooperation data stream based on the first augmented reality data stream; the field device then receives the user cooperation data stream, executes a corresponding action based on the user cooperation data stream, collects a second augmented reality data stream while the corresponding action is executed on the vehicle, and sends the second augmented reality data stream to the remote device. In this way, the user can remotely view a real vehicle rather than a virtual vehicle model and can perform real operations on it, which improves the user experience; the whole process requires no on-site manual participation, greatly reducing the operating cost.

Description

Method and device for remotely checking vehicle based on augmented reality
Technical Field
The invention relates to the technical field of internet, in particular to a method and a device for remotely checking a vehicle based on augmented reality.
Background
Vehicle products cannot always be displayed to the consumer in person. On average, consumers begin their car-purchasing journey three months before signing a purchase contract, and, combined with the consumption habits of today's vehicle consumers, the online digital consumption scenario runs through the entire car-buying customer journey.
One of the existing remote vehicle viewing modes is AR (Augmented Reality) vehicle viewing. Specifically, the AR camera is turned on, a plane is found in the real world, and a pre-designed 3D model of the vehicle is placed on it, as shown in fig. 1. In addition, some event interactions are added to the 3D model; for example, clicking a door of the AR vehicle model opens the door, and clicking the trunk opens the model's trunk, as shown in fig. 2.
However, this approach essentially shows only a 3D model, so the user experience is poor.
Another remote vehicle viewing mode is live-streamed vehicle viewing: a video connection is established between the remote location and the site, and the remote user can view a real on-site vehicle through real-time video.
However, this solution requires a host (anchor) to move the camera along a route designed by the host in order to show the on-site vehicle to the remote user. This mode has high labor costs, depends on the host's on-site operation, and can hardly meet the personalized viewing requirements of different users, such as dynamically changing the camera angle according to a remote user's needs.
Disclosure of Invention
In view of the above, embodiments of the present invention are proposed to provide a method for augmented reality-based remote viewing of a vehicle and a corresponding apparatus for augmented reality-based remote viewing of a vehicle that overcome or at least partially solve the above problems.
In order to solve the above problems, an embodiment of the present invention discloses a method for remotely viewing a vehicle based on augmented reality, including:
collecting a world environment data stream, and generating a first augmented reality data stream based on the world environment data stream;
transmitting the first augmented reality data stream to the remote device to cause the remote device to generate a user collaboration data stream based on the first augmented reality data stream;
receiving the user collaboration data stream;
executing a corresponding action based on the user collaboration data stream, and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle;
transmitting the second augmented reality data stream to the remote device.
In one or more embodiments, the performing a corresponding action based on the user collaboration data stream and collecting a second augmented reality data stream while performing the corresponding action on the vehicle includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is the opening/closing of the vehicle door, opening/closing the corresponding vehicle door according to the vehicle door identifier in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the corresponding vehicle door is opened/closed.
In one or more embodiments, the performing a corresponding action based on the user collaboration data stream and collecting a second augmented reality data stream while performing the corresponding action on the vehicle includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to open/close the trunk, opening/closing the trunk; an action parameter value corresponding to the action parameter in the user cooperation data stream is null;
acquiring the augmented reality data stream when the trunk is opened/closed.
In one or more embodiments, the performing a corresponding action based on the user collaboration data stream and collecting a second augmented reality data stream while performing the corresponding action on the vehicle includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to move the lens, moving the lens according to a moving direction value and a moving distance value in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the lens is moved.
In one or more embodiments, before the collecting the world environment data stream, further comprising:
and setting the augmented reality terminal equipment in the remote equipment to be in a multi-user operation mode.
In one or more embodiments, the sending the first augmented reality data stream to the remote device includes:
sending the first augmented reality data stream to the remote device using a preset frame rate;
the sending the second augmented reality data stream to the remote device includes:
and sending the second augmented reality data stream to the remote device by adopting a preset frame rate.
Correspondingly, the embodiment of the invention also discloses a method for remotely viewing the vehicle based on the augmented reality, which comprises the following steps:
receiving a first augmented reality data stream sent by a field device;
generating a corresponding user collaboration data stream in response to a gesture action performed with respect to a screen corresponding to the first augmented reality data stream; the user collaboration data stream includes action parameters and action parameter values;
sending the user collaboration data stream to the field device; so that the field device acquires a second augmented reality data stream based on the user collaboration data stream;
receiving the second augmented reality data stream;
and displaying the second augmented reality data stream.
In one or more embodiments, the generating corresponding user collaboration data in response to the gesture action performed with respect to the screen to which the first augmented reality data corresponds includes:
determining coordinates for executing the gesture in a picture corresponding to the first augmented reality data stream, and a corresponding area of the coordinates in the picture corresponding to the first augmented reality data stream;
identifying the gesture action to obtain an action parameter and an action parameter value corresponding to the gesture action in the corresponding area;
generating a user collaboration data stream based on the action parameters and the action parameter values.
Correspondingly, the embodiment of the invention discloses a device for remotely checking vehicles based on augmented reality, which comprises:
an acquisition module, configured to collect a world environment data stream and generate a first augmented reality data stream based on the world environment data stream;
a first sending module to send the first augmented reality data stream to the remote device to cause the remote device to generate a user collaboration data stream based on the first augmented reality data stream;
a first receiving module, configured to receive the user collaboration data stream;
a processing module for executing a corresponding action based on the user collaboration data stream and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle;
the first sending module is further configured to send the second augmented reality data stream to the remote device.
In one or more embodiments, the processing module includes:
the first identification submodule is used for identifying action parameters in the user collaboration data stream;
the execution submodule is used for opening/closing the corresponding vehicle door according to the vehicle door identifier in the action parameter value in the user cooperation data stream if the action parameter is the vehicle door opening/closing;
and the acquisition submodule is used for acquiring the augmented reality data stream when the corresponding vehicle door is opened/closed.
In one or more embodiments, the processing module includes:
the first identification submodule is further used for identifying action parameters in the user cooperation data stream;
the execution submodule is further used for opening/closing the trunk if the action parameter is the opening/closing of the trunk; an action parameter value corresponding to the action parameter in the user cooperation data stream is null;
the acquisition submodule is also used for acquiring the augmented reality data stream when the trunk is opened/closed.
In one or more embodiments, the processing module includes:
the first identification submodule is further used for identifying action parameters in the user cooperation data stream;
the execution submodule is further configured to, if the action parameter is to move the lens, move the lens according to a moving direction value and a moving distance value in the action parameter value in the user cooperation data stream;
the acquisition submodule is also used for acquiring the augmented reality data stream when the lens is moved.
In one or more embodiments, further comprising:
and the setting module is used for setting the augmented reality terminal equipment in the remote equipment into a multi-user operation mode before the world environment data stream is collected.
In one or more embodiments, the first sending module is specifically configured to:
sending the first augmented reality data stream to the remote device using a preset frame rate;
and,
and sending the second augmented reality data stream to the remote device by adopting a preset frame rate.
Correspondingly, the embodiment of the invention also discloses a device for remotely checking the vehicle based on the augmented reality, which comprises:
the second receiving module is used for receiving a first augmented reality data stream sent by the field device;
a generating module, configured to generate a corresponding user collaboration data stream in response to a gesture action performed on a screen corresponding to the first augmented reality data stream; the user collaboration data stream includes action parameters and action parameter values;
a second sending module, configured to send the user collaboration data stream to the field device; so that the field device acquires a second augmented reality data stream based on the user collaboration data stream;
the second receiving module is further configured to receive the second augmented reality data stream;
a display module to display the second augmented reality data stream.
In one or more embodiments, the generating module includes:
a determining submodule, configured to determine a coordinate of the gesture performed in a picture corresponding to the first augmented reality data stream, and a corresponding area of the coordinate in the picture corresponding to the first augmented reality data stream;
the second recognition submodule is used for recognizing the gesture action to obtain action parameters and action parameter values corresponding to the gesture action in the corresponding area;
a generating sub-module for generating a user collaboration data stream based on the action parameter and the action parameter value.
Correspondingly, the embodiment of the invention discloses an electronic device, which comprises: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the above-described augmented reality-based remote vehicle viewing method embodiments.
Accordingly, embodiments of the present invention disclose a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the above-described augmented reality-based remote vehicle viewing method embodiments.
The embodiment of the invention has the following advantages:
A field device collects a world environment data stream, generates a first augmented reality data stream based on the world environment data stream, and sends the first augmented reality data stream to a remote device, so that the remote device generates a user cooperation data stream based on the first augmented reality data stream; the field device then receives the user cooperation data stream, executes a corresponding action based on the user cooperation data stream, collects a second augmented reality data stream while the corresponding action is executed on the vehicle, and sends the second augmented reality data stream to the remote device. In this way, the user can remotely view a real vehicle rather than a virtual vehicle model and can perform real operations on it, which improves the user experience; the whole process requires no on-site manual participation, greatly reducing the operating cost.
Drawings
FIG. 1 is a first schematic diagram of conventional AR vehicle viewing;
FIG. 2 is a second schematic diagram of conventional AR vehicle viewing;
FIG. 3 is a flowchart illustrating steps of a first embodiment of an augmented reality-based method for remotely viewing a vehicle according to the present invention;
FIGS. 4A-4B are schematic layouts of field devices of the present invention;
FIG. 5 is a flowchart illustrating steps of a second embodiment of an augmented reality-based method for remotely viewing a vehicle according to the present invention;
FIG. 6 is a block diagram of a first embodiment of an apparatus for remotely viewing a vehicle based on augmented reality according to the present invention;
FIG. 7 is a block diagram of a second embodiment of an apparatus for remotely viewing a vehicle based on augmented reality according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
One of the core ideas of the embodiment of the present invention is that a field device collects a world environment data stream, generates a first augmented reality data stream based on the world environment data stream, and sends the first augmented reality data stream to a remote device, so that the remote device generates a user cooperation data stream based on the first augmented reality data stream; the field device then receives the user cooperation data stream, executes a corresponding action based on the user cooperation data stream, collects a second augmented reality data stream while the corresponding action is executed on the vehicle, and sends the second augmented reality data stream to the remote device. In this way, the user can remotely view a real vehicle rather than a virtual vehicle model and can perform real operations on it, which improves the user experience; the whole process requires no on-site manual participation, greatly reducing the operating cost.
Referring to fig. 3, a flowchart illustrating steps of a first embodiment of an augmented reality-based method for remotely viewing a vehicle according to the present invention is shown, wherein the method is applied to a field device, which may be a device located in the same space as the vehicle, and the field device includes, but is not limited to, an AR terminal and a mechanical device.
The AR terminal may be any terminal having basic AR functions, such as an AR camera or head-mounted AR hardware. These basic AR functions include automatically recognizing real-world objects seen by the AR terminal, and adding and interacting with virtual models in the real world. The embodiment of the present invention does not limit the specific form of the AR terminal, which may be chosen according to actual requirements.
The mechanical device is used to execute corresponding actions on the vehicle according to action commands from the remote device, and these actions include, but are not limited to: opening/closing doors, opening/closing the trunk, starting the engine, accelerating, shifting gears, shutting down the engine, and the like.
Further, the number of field devices may be one or more. For example, with the field device layout shown in fig. 4A, four field devices are deployed around the vehicle so that the user can switch the viewing angle in the remote device to view the vehicle from different angles. For another example, with the field device layout shown in fig. 4B, a single field device is deployed at the vehicle, and viewing the vehicle from different angles can also be achieved by operating the field device to move once around the vehicle. Therefore, in practical applications, the number of field devices may be set according to actual requirements, and the embodiment of the present invention is not limited thereto. For convenience of description, the embodiment of the present invention is described in detail by taking one field device as an example.
Further, before a connection is established between the field device and the remote device, the AR terminal in the field device may be set to a multi-user operation mode, in which the field device shares the AR terminal with the remote device. Throughout the sharing, in addition to the real-time transmission of data from the AR terminal, operations performed by the remote device on the AR terminal picture can be transmitted back to the field device, which can then operate the on-site vehicle in the real world.
It should be noted that, in the embodiment of the present invention, the AR terminal differs from a common terminal in that it encapsulates the capability to recognize real-world objects seen by the terminal, can label the coordinates of a real-world object, and can place a virtual object on the real-world object based on those coordinates. This cannot be achieved if the remote device and the field device transmit only an ordinary data stream.
The method specifically comprises the following steps:
step 301, collecting a world environment data stream, and generating a first augmented reality data stream based on the world environment data stream;
specifically, when the AR device in the field device is set to the multi-user operation mode and is in the working state, the world environment data stream may be collected, where the world environment data stream includes but is not limited to: vehicle information in the real environment, AR three-dimensional world coordinate system data, and the like. And then, the world environment data stream is packaged to obtain augmented reality data (for convenience of distinguishing, the augmented reality data is recorded as first augmented reality data).
Step 302, sending the first augmented reality data stream to the remote device, so that the remote device generates a user collaboration data stream based on the first augmented reality data stream;
specifically, after the first augmented reality data is obtained, the first augmented reality data is sent to the remote device, and the remote device receives the first augmented reality data stream to generate the user cooperation data stream. The specific generation manner of the user collaboration data stream is detailed in the rear.
Step 303, receiving the user cooperation data stream;
specifically, the field device and the remote device can be connected point-to-point without passing through an intermediate device, so that the field device can directly receive the user cooperation data stream transmitted by the remote device. Wherein, the user cooperation data flow includes but is not limited to: the gesture control method comprises the action parameters of the gesture action of the user and the action parameter values corresponding to the action parameters.
Where the action parameter may be an action that the field device needs to perform, the action parameter value may be a specific amount performed. For example, the motion parameter may be movement and the motion parameter values may be a direction value and a distance value.
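For example, under the assumption that the user cooperation data stream is carried as small structured messages (the field names below are illustrative and not part of the embodiment), individual messages could look like this:

```python
# Illustrative user cooperation messages (all field names are assumptions):
open_left_front_door = {
    "action": "open_door",               # action parameter
    "params": {"door_id": "front_left"}, # action parameter value: the door identifier
}

open_trunk = {
    "action": "open_trunk",
    "params": None,                      # the trunk is unique, so the value may be null
}

move_lens = {
    "action": "move_lens",
    "params": {"direction": "right", "distance_cm": 20},  # direction and distance values
}
```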
Step 304, executing a corresponding action based on the user cooperation data stream, and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle;
after the field device acquires the user cooperation data stream, corresponding actions can be executed based on the action parameters and the action parameter values, and meanwhile, the augmented reality data stream in the action execution process is collected in real time.
In an embodiment of the present invention, the executing a corresponding action based on the user cooperation data stream and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is the opening/closing of the vehicle door, opening/closing the corresponding vehicle door according to the vehicle door identifier in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the corresponding vehicle door is opened/closed.
Specifically, after the user cooperation data stream is received, the action parameters in it may be identified. If the action parameter is to open/close a vehicle door, the vehicle door identifier in the action parameter values is obtained, for example front-left, rear-left, front-right, or rear-right; the vehicle door corresponding to that identifier is then opened/closed by a mechanical device in the field device, and, at the same time, all images during the door opening/closing process are collected in real time from the moment the action parameters and action parameter values are received.
In an embodiment of the present invention, the executing a corresponding action based on the user cooperation data stream, and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle, includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to open/close the trunk, opening/closing the trunk; an action parameter value corresponding to the action parameter in the user cooperation data stream is null;
acquiring the augmented reality data stream when the trunk is opened/closed.
Specifically, after the user cooperation data stream is received, the action parameters in the user cooperation data stream can be identified, if the action parameters are opening/closing of the trunk, the trunk is opened/closed through a mechanical device in the field device, and all images in the process of opening/closing the trunk are collected in real time from the moment that the action parameters and the action parameter values are received.
Since the trunk is unique, the action parameter value for the trunk may be set to null. Of course, in practical applications, the action parameter value may be set to other values according to actual requirements without affecting the action performed by the field device, which is not limited in the embodiment of the present invention.
In an embodiment of the present invention, the executing a corresponding action based on the user cooperation data stream and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle includes:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to move the lens, moving the lens according to a moving direction value and a moving distance value in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the lens is moved.
Specifically, after the user collaboration data stream is received, the action parameters in it may be identified. If the action parameter is to move the lens, the moving direction value and moving distance value in the action parameter values are obtained, and the lens (that is, the AR terminal in the field device) is then moved according to the moving direction value and the moving distance value; at the same time, all images during the lens movement are collected in real time from the moment the action parameters and action parameter values are received.
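The three embodiments above can be summarized as a single dispatch routine on the field device. The sketch below assumes hypothetical `mech` (mechanical device) and `ar_terminal` interfaces with invented method names, since the embodiment does not specify concrete hardware APIs:

```python
def handle_user_collaboration(message: dict, mech, ar_terminal):
    """Dispatch one user cooperation message and collect the second AR data stream.

    `mech` and `ar_terminal` are assumed interfaces for the on-site mechanical
    device and the AR terminal; their method names are illustrative only.
    """
    action = message["action"]            # identify the action parameter
    params = message.get("params") or {}

    # Collection starts as soon as the action parameters are received.
    recording = ar_terminal.start_collecting_second_ar_stream()

    if action in ("open_door", "close_door"):
        # The door identifier (e.g. front_left, rear_right) is the action parameter value.
        mech.set_door(door_id=params["door_id"], open=(action == "open_door"))
    elif action in ("open_trunk", "close_trunk"):
        # The trunk is unique, so no action parameter value is required.
        mech.set_trunk(open=(action == "open_trunk"))
    elif action == "move_lens":
        # Move the lens (the AR terminal) by the given direction and distance values.
        ar_terminal.move(direction=params["direction"],
                         distance_cm=params["distance_cm"])
    else:
        recording.stop()
        raise ValueError(f"unknown action parameter: {action}")

    return recording  # the caller sends the collected stream to the remote device
```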
Step 305, sending the second augmented reality data stream to the remote device.
After the second augmented reality data stream is collected, the second augmented reality data stream is directly sent to the remote equipment, and the remote equipment can display the second augmented reality data stream to the user.
In an embodiment of the present invention, the sending the first augmented reality data stream to the remote device includes:
sending the first augmented reality data stream to the remote device using a preset frame rate;
the sending the second augmented reality data stream to the remote device includes:
and sending the second augmented reality data stream to the remote device by adopting a preset frame rate.
Specifically, in order to ensure real-time performance of data transmission between the remote device and the field device, the augmented reality data stream needs to be transmitted at a very high frame rate, so when the first augmented reality data stream and the second augmented reality data stream are transmitted, the first augmented reality data stream and the second augmented reality data stream can be transmitted at a preset frame rate.
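As a rough illustration of sending at a preset frame rate, the loop below paces transmission to a fixed number of frames per second; the 30 fps value and the `capture_frame`/`send` callables are assumptions, not values given in the embodiment.

```python
import time


def stream_at_preset_frame_rate(capture_frame, send, fps: int = 30):
    """Capture and send augmented reality frames at a preset frame rate."""
    interval = 1.0 / fps
    while True:
        started = time.monotonic()
        send(capture_frame())               # push one frame to the remote device
        elapsed = time.monotonic() - started
        if elapsed < interval:
            time.sleep(interval - elapsed)  # hold the preset frame rate
```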
In the embodiment of the invention, a field device collects a world environment data stream, generates a first augmented reality data stream based on the world environment data stream, and sends the first augmented reality data stream to a remote device, so that the remote device generates a user cooperation data stream based on the first augmented reality data stream; the field device then receives the user cooperation data stream, executes a corresponding action based on the user cooperation data stream, collects a second augmented reality data stream while the corresponding action is executed on the vehicle, and sends the second augmented reality data stream to the remote device. In this way, the user can remotely view a real vehicle rather than a virtual vehicle model and can perform real operations on it, which improves the user experience; the whole process requires no on-site manual participation, greatly reducing the operating cost.
Referring to fig. 5, a flowchart illustrating steps of a second embodiment of an augmented reality-based remote vehicle viewing method according to the present invention is shown, where the method can be applied to a remote device, and the remote device can have the following features:
(1) on a hardware architecture, a device has a central processing unit, a memory, an input unit and an output unit, that is, the device is often a microcomputer device having a communication function. In addition, various input modes such as a keyboard, a mouse, a touch screen, a microphone, a camera and the like can be provided, and input can be adjusted as required. Meanwhile, the equipment often has a plurality of output modes, such as a telephone receiver, a display screen and the like, and can be adjusted according to needs;
(2) on a software system, the device must have an operating system, such as Windows Mobile, Symbian, Palm, Android, iOS, and the like. Meanwhile, these operating systems are increasingly open, and countless personalized application programs have been developed on these open platforms, such as address books, schedules, notepads, calculators, and various games, which meet the needs of personalized users to a great extent;
(3) in terms of communication capacity, the device has flexible access modes and high-bandwidth communication performance, and can automatically adjust the selected communication mode according to the selected service and the environment, which is convenient for users. The device may support 3GPP (3rd Generation Partnership Project), 4GPP (4th Generation Partnership Project), 5GPP (5th Generation Partnership Project), LTE (Long Term Evolution), WiMAX (Worldwide Interoperability for Microwave Access), mobile communication based on the TCP/IP (Transmission Control Protocol/Internet Protocol) and UDP (User Datagram Protocol) protocols, computer network communication based on the TCP/IP protocol, and short-range wireless transmission based on the Bluetooth and infrared transmission standards, supporting not only voice services but also various wireless data services;
(4) in the aspect of function use, the equipment focuses more on humanization, individuation and multi-functionalization. With the development of computer technology, devices enter a human-centered mode from a device-centered mode, and the embedded computing, control technology, artificial intelligence technology, biometric authentication technology and the like are integrated, so that the human-oriented purpose is fully embodied. Due to the development of software technology, the equipment can be adjusted and set according to individual requirements, and is more personalized. Meanwhile, the device integrates a plurality of software and hardware, and the function is more and more powerful.
The method specifically comprises the following steps:
step 501, receiving a first augmented reality data stream sent by a field device;
specifically, after the field device establishes a connection with the remote device and the AR in the field device is set to the multi-user operation mode, the remote device may receive a first augmented reality data stream sent by the field device.
Step 502, generating a corresponding user collaboration data stream in response to a gesture action performed on a screen corresponding to the first augmented reality data stream; the user collaboration data stream includes action parameters and action parameter values;
when the remote device detects a gesture action executed by the user on the displayed picture corresponding to the first augmented reality data stream, the action parameter and the action parameter value corresponding to the gesture action can be generated, and the user collaboration data stream is generated.
In an embodiment of the present invention, the generating, in response to a gesture action performed on a screen corresponding to the first augmented reality data, corresponding user collaboration data includes:
determining coordinates for executing the gesture in a picture corresponding to the first augmented reality data stream, and a corresponding area of the coordinates in the picture corresponding to the first augmented reality data stream;
identifying the gesture action to obtain an action parameter and an action parameter value corresponding to the gesture action in the corresponding area;
generating a user collaboration data stream based on the action parameters and the action parameter values.
Specifically, when a gesture action is detected, the coordinates at which the gesture action was performed in the picture corresponding to the first augmented reality data stream, and the region of that picture containing those coordinates, can be determined; that is, it is determined on which region of the picture the user performed the gesture. The gesture action is then recognized to obtain the action parameter and action parameter value corresponding to the gesture action in that region. The correspondence between gestures, regions, action parameters, and action parameter values may be as shown in Table 1:
Table 1 (presented as an image in the original publication; it maps each gesture and screen region to the corresponding action parameter and action parameter value)
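Since Table 1 is reproduced only as an image in the publication, the mapping below is an assumed example of how such a correspondence might be encoded in code; the specific entries are illustrative and do not reproduce the actual contents of Table 1.

```python
# Assumed example of a (region, gesture) -> action correspondence; entries are
# illustrative only and are not the patent's actual Table 1.
GESTURE_ACTION_TABLE = {
    ("door_region", "tap"):  ("open_door",  "door identifier of the tapped region"),
    ("trunk_region", "tap"): ("open_trunk", None),
    ("background", "slide"): ("move_lens",  "direction value and distance value"),
}


def lookup_action(region: str, gesture: str):
    """Look up the action parameter and a description of its parameter value."""
    return GESTURE_ACTION_TABLE.get((region, gesture))
```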
It should be noted that, in the embodiment of the present invention, a mapping relationship between a sliding value on the screen and an actual value may be preset; when a gesture is a slide, zoom-in, zoom-out, or the like, the specific parameter value of the action to be performed by the field device may be calculated according to this mapping relationship. For example, assuming 1 pixel maps to 2 centimeters, when the user slides 10 pixels across the screen of the remote terminal, the field device correspondingly moves 20 centimeters.
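A minimal sketch of this pixel-to-real-world mapping, using the 1 pixel to 2 centimeters scale from the example above (the function and field names are assumptions):

```python
PIXELS_TO_CM = 2.0  # assumed preset mapping: 1 screen pixel corresponds to 2 centimeters on site


def slide_to_move_params(start_px: tuple, end_px: tuple) -> dict:
    """Convert a slide gesture on the remote screen into a move_lens action parameter value."""
    dx = end_px[0] - start_px[0]
    dy = end_px[1] - start_px[1]
    if abs(dx) >= abs(dy):
        direction = "right" if dx >= 0 else "left"
    else:
        direction = "down" if dy >= 0 else "up"
    distance_px = max(abs(dx), abs(dy))
    return {"direction": direction, "distance_cm": distance_px * PIXELS_TO_CM}


# A 10-pixel slide to the right maps to a 20 cm movement of the field device.
print(slide_to_move_params((100, 50), (110, 50)))  # {'direction': 'right', 'distance_cm': 20.0}
```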
Alternatively, the spatial coordinates of the field device before and after the movement may be calculated from the screen coordinates before and after the slide, and the field device may then be moved to the calculated post-movement spatial coordinates.
Of course, the field device may be moved in other ways besides the above-mentioned ways, and may be set according to actual requirements in practical applications, which is not limited in the embodiment of the present invention.
Further, clicks on the vehicle doors and the trunk can be identified with a trained learning model. The learning model identifies the door regions and the trunk region in the picture corresponding to the first augmented reality data stream, and when a gesture action is detected, it is checked whether the coordinates of the gesture action fall within a door region or the trunk region.
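As an illustration of the hit test described above, assuming the trained learning model returns bounding boxes for the recognized regions (the region names and box format below are assumptions):

```python
def classify_tap(tap_xy, regions):
    """Return which recognized region a tap falls into, or None if it hits none.

    `regions` is assumed to be produced by the trained learning model as a
    mapping {region_name: (x_min, y_min, x_max, y_max)} of bounding boxes.
    """
    x, y = tap_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


# Example usage with made-up bounding boxes:
regions = {"door_front_left": (120, 200, 320, 420), "trunk": (600, 250, 820, 430)}
print(classify_tap((200, 300), regions))  # door_front_left
```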
Step 503, sending the user collaboration data stream to the field device, so that the field device acquires a second augmented reality data stream based on the user collaboration data stream;
Step 504, receiving the second augmented reality data stream sent by the field device;
After the action parameters and the action parameter values are determined, they are sent to the field device, so that the field device executes the corresponding action according to the action parameters and the action parameter values, collects a second augmented reality data stream during the execution of the action, and sends the second augmented reality data stream to the remote device. For the way the field device executes these steps, reference may be made to step 301 to step 305 above; details are not repeated here to avoid repetition.
Step 505, displaying the second augmented reality data stream.
After receiving the second augmented reality data stream, the remote device displays it to the user.
It should be noted that, in order to ensure the user experience, the transmission interval may be shortened (that is, the transmission frequency increased) when the remote device sends the action parameters and action parameter values to the field device. In this way, while the user is performing a gesture action on the remote device, the field device can execute the corresponding action parameters and action parameter values synchronously and in real time. For example, when the user slides once from left to right on the screen of the remote device, instead of sending the action parameters and action parameter values only after the slide stops, they may be sent at the fastest transmission frequency while the user is sliding.
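A minimal sketch of sending action parameters while the gesture is still in progress rather than only after it ends; the 30-millisecond minimum interval and the callback shape are assumptions:

```python
import time


class GestureStreamer:
    """Send move_lens messages continuously while a slide gesture is in progress."""

    def __init__(self, send, min_interval_s: float = 0.03):
        self.send = send              # callable that transmits one message to the field device
        self.min_interval_s = min_interval_s
        self._last_sent = 0.0

    def on_gesture_move(self, params: dict):
        """Called for every intermediate move event, not only when the slide ends."""
        now = time.monotonic()
        if now - self._last_sent >= self.min_interval_s:
            self.send({"action": "move_lens", "params": params})
            self._last_sent = now
```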
Correspondingly, the user also receives and displays the augmented reality data stream sent by the field device in real time during the gesture action, so that the picture corresponding to the first augmented reality data stream is updated in real time by that stream, rather than the augmented reality data stream received from the field device being displayed only after the user's gesture action has finished.
In the embodiment of the invention, a remote device receives a first augmented reality data stream sent by a field device; generating a corresponding user collaboration data stream in response to a gesture action performed with respect to a screen corresponding to the first augmented reality data stream; the user collaboration data stream includes action parameters and action parameter values; sending the user collaboration data stream to the field device; so that the field device acquires a second augmented reality data stream based on the user collaboration data stream; receiving the second augmented reality data stream; and displaying the second augmented reality data stream. Therefore, a user can remotely check a real vehicle instead of a virtual vehicle model and can perform real operation on the vehicle, so that the user experience is improved, the whole process does not need manual participation, and the operation cost is greatly saved.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 6, a block diagram of a first embodiment of an apparatus for remotely viewing a vehicle based on augmented reality according to the present invention is shown, which may specifically include the following modules:
the acquisition module 601 is configured to acquire a world environment data stream and generate a first augmented reality data stream based on the world environment data stream;
a first sending module 602, configured to send the first augmented reality data stream to the remote device, so that the remote device generates a user collaboration data stream based on the first augmented reality data stream;
a first receiving module 603, configured to receive the user collaboration data stream;
a processing module 604, configured to execute a corresponding action based on the user collaboration data stream, and collect a second augmented reality data stream when the corresponding action is executed on the vehicle;
the first sending module is further configured to send the second augmented reality data stream to the remote device.
In an embodiment of the present invention, the processing module includes:
the first identification submodule is used for identifying action parameters in the user collaboration data stream;
the execution submodule is used for opening/closing the corresponding vehicle door according to the vehicle door identifier in the action parameter value in the user cooperation data stream if the action parameter is the vehicle door opening/closing;
and the acquisition submodule is used for acquiring the augmented reality data stream when the corresponding vehicle door is opened/closed.
In an embodiment of the present invention, the processing module includes:
the first identification submodule is further used for identifying action parameters in the user cooperation data stream;
the execution submodule is further used for opening/closing the trunk if the action parameter is the opening/closing of the trunk; an action parameter value corresponding to the action parameter in the user cooperation data stream is null;
the acquisition submodule is also used for acquiring the augmented reality data stream when the trunk is opened/closed.
In an embodiment of the present invention, the processing module includes:
the first identification submodule is further used for identifying action parameters in the user cooperation data stream;
the execution submodule is further configured to, if the action parameter is to move the lens, move the lens according to a moving direction value and a moving distance value in the action parameter value in the user cooperation data stream;
the acquisition submodule is also used for acquiring the augmented reality data stream when the lens is moved.
In the embodiment of the present invention, the method further includes:
and the setting module is used for setting the augmented reality terminal equipment in the remote equipment into a multi-user operation mode before the world environment data stream is collected.
In this embodiment of the present invention, the first sending module is specifically configured to:
sending the first augmented reality data stream to the remote device using a preset frame rate;
and,
and sending the second augmented reality data stream to the remote device by adopting a preset frame rate.
Referring to fig. 7, a block diagram of a second embodiment of the augmented reality-based device for remotely viewing a vehicle according to the present invention is shown, and specifically includes the following modules:
a second receiving module 701, configured to receive a first augmented reality data stream sent by a field device;
a generating module 702, configured to generate a corresponding user collaboration data stream in response to a gesture action performed on a screen corresponding to the first augmented reality data stream; the user collaboration data stream includes action parameters and action parameter values;
a second sending module 703, configured to send the user cooperation data stream to the field device; so that the field device acquires a second augmented reality data stream based on the user collaboration data stream;
the second receiving module is further configured to receive the second augmented reality data stream;
a display module 704 configured to display the second augmented reality data stream.
In an embodiment of the present invention, the generating module includes:
a determining submodule, configured to determine a coordinate of the gesture performed in a picture corresponding to the first augmented reality data stream, and a corresponding area of the coordinate in the picture corresponding to the first augmented reality data stream;
the second recognition submodule is used for recognizing the gesture action to obtain action parameters and action parameter values corresponding to the gesture action in the corresponding area;
a generating sub-module for generating a user collaboration data stream based on the action parameter and the action parameter value.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, including:
the method comprises a processor, a memory and a computer program which is stored on the memory and can run on the processor, wherein when the computer program is executed by the processor, each process of the embodiment of the method for remotely checking the vehicle based on the augmented reality is realized, the same technical effect can be achieved, and the method is not repeated herein for avoiding repetition.
The embodiment of the invention also provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program realizes each process of the embodiment of the method for remotely checking the vehicle based on the augmented reality, can achieve the same technical effect, and is not repeated here to avoid repetition.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method for remotely viewing the vehicle based on the augmented reality and the device for remotely viewing the vehicle based on the augmented reality provided by the invention are described in detail, specific examples are applied in the text to explain the principle and the implementation mode of the invention, and the description of the above examples is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (12)

1. An augmented reality based method of remotely viewing a vehicle, the method comprising:
collecting a world environment data stream, and generating a first augmented reality data stream based on the world environment data stream;
transmitting the first augmented reality data stream to the remote device to cause the remote device to generate a user collaboration data stream based on the first augmented reality data stream;
receiving the user collaboration data stream;
executing a corresponding action based on the user collaboration data stream, and collecting a second augmented reality data stream when the corresponding action is executed on the vehicle;
transmitting the second augmented reality data stream to the remote device.
2. The augmented reality-based remote viewing vehicle method of claim 1, wherein the performing a corresponding action based on the user collaboration data stream and capturing a second augmented reality data stream while performing the corresponding action on the vehicle comprises:
identifying an action parameter in the user collaboration data stream;
if the action parameter is the opening/closing of the vehicle door, opening/closing the corresponding vehicle door according to the vehicle door identifier in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the corresponding vehicle door is opened/closed.
3. The augmented reality-based remote viewing vehicle method of claim 1, wherein the performing a corresponding action based on the user collaboration data stream and capturing a second augmented reality data stream while performing the corresponding action on the vehicle comprises:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to open/close the trunk, opening/closing the trunk; an action parameter value corresponding to the action parameter in the user cooperation data stream is null;
acquiring the augmented reality data stream when the trunk is opened/closed.
4. The augmented reality-based remote viewing vehicle method of claim 1, wherein the performing a corresponding action based on the user collaboration data stream and capturing a second augmented reality data stream while performing the corresponding action on the vehicle comprises:
identifying an action parameter in the user collaboration data stream;
if the action parameter is to move the lens, moving the lens according to a moving direction value and a moving distance value in the action parameter value in the user cooperation data stream;
and acquiring the augmented reality data stream when the lens is moved.
5. The augmented reality-based method of remotely viewing a vehicle as recited in claim 1, further comprising, prior to the collecting the stream of world environment data:
and setting the augmented reality terminal equipment in the remote equipment to be in a multi-user operation mode.
6. The augmented reality-based remote viewing vehicle method of claim 1, wherein the sending the first augmented reality data stream to the remote device comprises:
sending the first augmented reality data stream to the remote device using a preset frame rate;
the sending the second augmented reality data stream to the remote device includes:
and sending the second augmented reality data stream to the remote device by adopting a preset frame rate.
7. An augmented reality-based method for remotely viewing a vehicle, the method comprising:
receiving a first augmented reality data stream sent by a field device;
generating a corresponding user collaboration data stream in response to a gesture action performed with respect to a picture corresponding to the first augmented reality data stream, wherein the user collaboration data stream includes action parameters and action parameter values;
sending the user collaboration data stream to the field device, so that the field device collects a second augmented reality data stream based on the user collaboration data stream;
receiving the second augmented reality data stream;
displaying the second augmented reality data stream.
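Read end to end, claim 7 describes a simple request/response loop on the remote device: display the incoming stream, turn a gesture into a user collaboration data stream, send it, and display whatever the field device returns. A hedged sketch follows; the channel and ui objects and their methods (receive_stream, display, wait_for_gesture, send) are assumed helpers, and gesture_to_message is the mapping sketched after claim 8 below.

    def remote_viewing_loop(channel, ui):
        """Remote-device side of the flow in claim 7 (illustrative only)."""
        first_stream = channel.receive_stream()    # first augmented reality data stream
        ui.display(first_stream)

        gesture = ui.wait_for_gesture()            # gesture on the displayed picture
        collaboration_msg = gesture_to_message(gesture, first_stream)
        channel.send(collaboration_msg)            # user collaboration data stream

        second_stream = channel.receive_stream()   # second augmented reality data stream
        ui.display(second_stream)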
8. The augmented reality-based method for remotely viewing a vehicle of claim 7, wherein the generating a corresponding user collaboration data stream in response to a gesture action performed with respect to a picture corresponding to the first augmented reality data stream comprises:
determining the coordinates at which the gesture action is executed in the picture corresponding to the first augmented reality data stream, and the area of that picture corresponding to the coordinates;
identifying the gesture action to obtain an action parameter and an action parameter value corresponding to the gesture action in the corresponding area;
generating the user collaboration data stream based on the action parameter and the action parameter value.
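The mapping in claim 8, from a gesture to an action parameter and an action parameter value, can be illustrated as follows: locate the touch coordinates in the displayed picture, decide which area of the vehicle they fall in, and pick the action accordingly. The area names, gesture attributes (position, kind, direction, distance), and the region_at helper are assumptions; the application does not fix a particular gesture vocabulary.

    def gesture_to_message(gesture, stream):
        """Illustrative mapping from a gesture action to a user collaboration
        data stream message (action parameter + action parameter value)."""
        x, y = gesture.position            # coordinates in the displayed picture
        region = stream.region_at(x, y)    # e.g. "front_left_door", "trunk", "background"

        if region.endswith("_door") and gesture.kind == "tap":
            return {"action": "open_door", "value": {"door_id": region}}
        if region == "trunk" and gesture.kind == "tap":
            return {"action": "open_trunk", "value": None}
        if gesture.kind == "swipe":
            return {"action": "move_lens",
                    "value": {"direction": gesture.direction, "distance": gesture.distance}}
        return None                        # gesture not mapped to any action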
9. An augmented reality-based apparatus for remotely viewing a vehicle, the apparatus comprising:
a first receiving module, configured to receive an action parameter and a corresponding action parameter value for remotely viewing the vehicle, sent by a remote device;
a processing module, configured to execute a corresponding action based on the action parameter and the action parameter value, and to collect an augmented reality data stream when the corresponding action is executed on the vehicle;
a first sending module, configured to send the augmented reality data stream to the remote device.
10. An augmented reality-based apparatus for remotely viewing a vehicle, the apparatus comprising:
a generating module, configured to generate a corresponding action parameter and action parameter value in response to a gesture action performed with respect to a picture corresponding to a first augmented reality data stream;
a second sending module, configured to send the action parameter and the action parameter value to a field device;
a second receiving module, configured to receive the augmented reality data stream sent by the field device;
a display module, configured to display the augmented reality data stream.
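For illustration only, the module decomposition in claims 9 and 10 can be pictured as two thin classes, one per device, each holding the receiving/processing/sending (respectively generating/sending/receiving/display) responsibilities. Class and method names are assumptions, and the sketch reuses the hypothetical handle_collaboration_message and gesture_to_message helpers from the earlier sketches.

    class FieldDeviceApparatus:
        """Claim 9: receive parameters, execute the action, return the AR stream."""
        def __init__(self, vehicle, camera, channel):
            self.vehicle, self.camera, self.channel = vehicle, camera, channel

        def run_once(self):
            msg = self.channel.receive()                      # first receiving module
            stream = handle_collaboration_message(msg, self.vehicle, self.camera)  # processing module
            self.channel.send(stream)                         # first sending module


    class RemoteDeviceApparatus:
        """Claim 10: turn a gesture into parameters, send them, display the result."""
        def __init__(self, ui, channel):
            self.ui, self.channel = ui, channel

        def run_once(self, gesture, current_stream):
            msg = gesture_to_message(gesture, current_stream) # generating module
            self.channel.send(msg)                            # second sending module
            stream = self.channel.receive()                   # second receiving module
            self.ui.display(stream)                           # display module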
11. An electronic device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the augmented reality-based method for remotely viewing a vehicle according to any one of claims 1 to 8.
12. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, carries out the steps of the augmented reality-based method for remotely viewing a vehicle according to any one of claims 1 to 8.
CN202111620206.0A 2021-12-27 2021-12-27 Method and device for remotely checking vehicle based on augmented reality Pending CN114415828A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111620206.0A CN114415828A (en) 2021-12-27 2021-12-27 Method and device for remotely checking vehicle based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111620206.0A CN114415828A (en) 2021-12-27 2021-12-27 Method and device for remotely checking vehicle based on augmented reality

Publications (1)

Publication Number Publication Date
CN114415828A true CN114415828A (en) 2022-04-29

Family

ID=81269413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111620206.0A Pending CN114415828A (en) 2021-12-27 2021-12-27 Method and device for remotely checking vehicle based on augmented reality

Country Status (1)

Country Link
CN (1) CN114415828A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017020132A1 (en) * 2015-08-04 2017-02-09 Yasrebi Seyed-Nima Augmented reality in vehicle platforms
CN105975232A (en) * 2016-05-06 2016-09-28 深圳市吾悦科技有限公司 Real-time interaction system and method for augmented reality
US11079753B1 (en) * 2018-01-07 2021-08-03 Matthew Roy Self-driving vehicle with remote user supervision and temporary override
CN110025952A (en) * 2019-04-10 2019-07-19 深圳真会玩网络游戏有限公司 Racer toy vehicles long-range control method, device and server
WO2020229841A1 (en) * 2019-05-15 2020-11-19 Roborace Limited A metaverse data fusion system
US20210150236A1 (en) * 2019-11-18 2021-05-20 Lg Electronics Inc. Remote control method of the vehicle and a mixed reality device and a vehicle
CN213938189U (en) * 2020-10-13 2021-08-10 中联重科股份有限公司 Non-blind area remote control system based on mixed reality technology and engineering vehicle
CN112667139A (en) * 2020-12-11 2021-04-16 深圳市越疆科技有限公司 Robot operation method, device, equipment and storage medium based on augmented reality
CN112509152A (en) * 2020-12-17 2021-03-16 重庆实唯信息技术有限公司 Car watching method, system, equipment and readable medium based on AR technology
CN112667179A (en) * 2020-12-18 2021-04-16 北京理工大学 Remote synchronous collaboration system based on mixed reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张燕翔 (Zhang Yanxiang) et al.: "Interactive Spatial Augmented Reality Technology for Stage Performances" (舞台展演交互式空间增强现实技术), vol. 978, 31 August 2018, Hefei: University of Science and Technology of China Press, pages: 12 - 17 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114791766A (en) * 2022-05-16 2022-07-26 上海邻里邻外信息科技有限公司 AR device-based operation method, device, medium and device

Similar Documents

Publication Publication Date Title
CN111556278B (en) Video processing method, video display device and storage medium
KR20220130197A (en) Filming method, apparatus, electronic equipment and storage medium
CN110662083A (en) Data processing method and device, electronic equipment and storage medium
CN106791893A (en) Net cast method and device
CN112199016B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109582122B (en) Augmented reality information providing method and device and electronic equipment
US9076345B2 (en) Apparatus and method for tutoring in convergence space of real and virtual environment
CN110162258B (en) Personalized scene image processing method and device
CN113194254A (en) Image shooting method and device, electronic equipment and storage medium
CN113259583B (en) Image processing method, device, terminal and storage medium
US20150244984A1 (en) Information processing method and device
CN113014960B (en) Method, device and storage medium for online video production
CN112783700A (en) Computer readable medium for network-based remote assistance system
CN114387445A (en) Object key point identification method and device, electronic equipment and storage medium
CN112887601B (en) Shooting method and device and electronic equipment
CN111435422B (en) Action recognition method, control method and device, electronic equipment and storage medium
CN108171222B (en) Real-time video classification method and device based on multi-stream neural network
CN104113682A (en) Image acquisition method and electronic equipment
CN114415828A (en) Method and device for remotely checking vehicle based on augmented reality
CN111045586B (en) Interface switching method based on three-dimensional scene, vehicle-mounted equipment and vehicle
CN112783316A (en) Augmented reality-based control method and apparatus, electronic device, and storage medium
CN102612205B (en) Method for controlling visual light sources, terminals and video conference system
CN112837372A (en) Data generation method and device, electronic equipment and storage medium
CN114371898B (en) Information display method, equipment, device and storage medium
CN114266305A (en) Object identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination