CN111970456B - Shooting control method, device, equipment and storage medium - Google Patents


Info

Publication number
CN111970456B
CN111970456B (application CN202010962809.8A)
Authority
CN
China
Prior art keywords
shooting
control
instruction
head
frame
Prior art date
Legal status
Active
Application number
CN202010962809.8A
Other languages
Chinese (zh)
Other versions
CN111970456A (en)
Inventor
蒋燚
马标
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010962809.8A
Publication of CN111970456A
Application granted
Publication of CN111970456B
Status: Active

Classifications

    • H04N23/62 Control of parameters via user interfaces (Control of cameras or camera modules comprising electronic image sensors)
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters, for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The embodiment of the application discloses a shooting control method, apparatus, device, and storage medium, belonging to the field of human-computer interaction. The method comprises the following steps: displaying a shooting view interface superimposed on the real environment picture, wherein an operation control in the shooting view interface is in a non-triggerable state; receiving a control operation acting on the shooting view interface; and determining a shooting control instruction corresponding to the control operation according to the operation type of the control operation, and executing the shooting control instruction. In the embodiment of the application, the operation control is in a non-triggerable state while the shooting view interface is displayed, and the corresponding shooting control instruction is executed according to the operation type of the control operation. This avoids the situation where the view-finding picture changes because the user alters the device posture or the gaze direction of the eyes to focus on an operation control, so that the captured picture is not the one the user intended; the shooting efficiency and accuracy of the head-mounted audio-visual device are thereby improved.

Description

Shooting control method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of human-computer interaction, in particular to a shooting control method, a shooting control device, shooting control equipment and a storage medium.
Background
Augmented Reality (AR) technology fuses virtual content with the real world: virtual content generated by computer equipment, such as text, images, three-dimensional models, music, and video, is simulated and then applied onto the real world. Virtual Reality (VR) technology, in contrast, simulates a virtual environment and virtual content from data collected in the real environment. With a head-mounted audio-visual device, a user can experience a variety of operations on AR or VR content.
In the related art, when a user takes a photo or records a video through a head-mounted audio-visual device, the user controls the device to trigger an operation control in the shooting interface, which causes the device to capture the photo or video.
However, triggering the operation control requires the user to change the device posture of the head-mounted audio-visual device so that the focus position moves onto the control's icon. Changing the device posture may change the view-finding picture of the device, so the finally captured picture may not be the one the user wanted to capture.
Disclosure of Invention
The embodiment of the application provides a shooting control method, a shooting control device, shooting control equipment and a storage medium. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a shooting control method, where the method is used for a head-mounted audio-visual device, and the method includes:
a shooting view interface is displayed on a real environment picture in an overlapping mode, and an operation control in the shooting view interface is in a non-triggerable state;
receiving a control operation acting on the shooting and viewing interface, wherein the control operation is triggered through a touch area of the head-mounted audio-visual equipment or a control device connected with the head-mounted audio-visual equipment;
and determining a shooting control instruction corresponding to the control operation according to the operation type of the control operation, and executing the shooting control instruction.
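The three claimed steps can be sketched in code. This is an illustrative model, not the patent's implementation; all class and method names are assumptions.

```python
# Hedged sketch of the claimed flow: overlay the shooting view interface on
# the real environment frame with its operation controls disabled, then wait
# for a control operation from a touch source.

class ShootingViewfinder:
    def __init__(self):
        # Claimed behavior: controls start in a non-triggerable state.
        self.controls_triggerable = False
        self.overlay_visible = False

    def show_over(self, environment_frame: str) -> str:
        """Compose the viewfinder UI over the camera frame (step 1)."""
        self.overlay_visible = True
        return f"{environment_frame}+viewfinder_overlay"
```

Steps 2 and 3 (receiving the control operation and dispatching an instruction by operation type) are discussed further below in the description.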
On the other hand, an embodiment of the present application provides a shooting control apparatus, including:
the first display module is used for displaying a shooting and viewing interface in an overlapping mode on a real environment picture, and an operation control in the shooting and viewing interface is in a non-triggerable state;
the receiving module is used for receiving control operation acting on the shooting and viewing interface, and the control operation is triggered through a touch area of the head-mounted audio-visual equipment or control equipment connected with the head-mounted audio-visual equipment;
and the determining module is used for determining a shooting control instruction corresponding to the control operation according to the operation type of the control operation and executing the shooting control instruction.
In another aspect, an embodiment of the present application provides a head-mounted audio-visual device, which includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the shooting control method according to the above aspect.
In another aspect, an embodiment of the present application provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the shooting control method according to the above aspect.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the head mounted audio-visual device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the head mounted audio-visual device executes the shooting control method provided in the various alternative implementations of the above-described aspect.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
in the embodiment of the application, the operation control is in a non-triggerable state while the shooting view interface is displayed, and the corresponding shooting control instruction is executed according to the operation type of the control operation. This avoids the situation where, when the user controls the head-mounted audio-visual device to shoot by triggering an operation control, changing the device posture or the gaze direction of the eyes to focus on the control also changes the view-finding picture, so that the captured picture is not the one the user wanted; the shooting efficiency and accuracy of the head-mounted audio-visual device are thereby improved.
Drawings
FIG. 1 is a schematic diagram of a head mounted audiovisual device provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a head mounted audiovisual device provided in another exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a head mounted audiovisual device and control device provided in an exemplary embodiment of the present application;
fig. 4 is a flowchart of a photographing control method according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a touch area provided in an exemplary embodiment of the present application;
fig. 6 is a flowchart of a photographing control method according to another exemplary embodiment of the present application;
FIG. 7 is a diagram illustrating a switching viewfinder provided by an exemplary embodiment of the present application;
FIG. 8 is a schematic diagram of a photographic viewing interface provided by an exemplary embodiment of the present application;
fig. 9 is a block diagram of a configuration of a photographing control apparatus according to an exemplary embodiment of the present application;
fig. 10 is a block diagram of a head-mounted audio-visual device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Reference herein to "a plurality" means two or more. "And/or" describes the association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
In one possible implementation, the head-mounted audiovisual device is an AR device, a VR device, or an AR and VR integrated audiovisual device.
When the head-mounted audio-visual device displays multimedia content by using AR technology, the display principle can be roughly divided into three types:
a head-wearing audio-visual device provided with a display screen and a camera collects real environment pictures around through the camera, then superimposes virtual information on the real environment pictures, and displays the superimposed pictures through the display screen.
A head-mounted audio-visual device provided with a projection assembly and a transparent lens projects virtual information onto the transparent lens through the projection assembly, so that a user can observe a real environment and the virtual information through the transparent lens at the same time, and experience of editing the virtual information in the real environment is obtained.
A head-mounted audio-visual device with the projection component arranged on the inner side of the device projects the virtual information directly onto the user's eyeballs, so that the user experiences editing the virtual information in the real environment. The virtual information includes text, models, web pages, multimedia content (e.g., virtual images, video, audio), and the like.
Fig. 1 shows a head-mounted audio-visual device 110, a Head-Mounted Display (HMD) device. The device 110 collects real-time environment pictures through a camera 111, superimposes virtual information on them, and displays the superimposed pictures on a display screen 112; the user wears the device 110 on the head and observes, through the display screen 112, a scene in which the virtual information and the real-time environment pictures are merged. Fig. 2 shows another head-mounted audio-visual device 210, a glasses-type device. A projection component 211 is disposed on the outer side of a lens; the device 210 projects virtual information onto a lens 212 through the projection component 211, and after wearing the device the user can observe the real environment picture and the virtual information through the lens 212.
The present application takes a head-mounted audio-visual device provided with a display screen and a camera as an example. As shown in fig. 3, the head-mounted audio-visual device 310 is provided with a camera assembly 311 and a display screen assembly 312; the camera assembly 311 captures real-time images of the surrounding real environment, and after the real environment images are fused with the AR information, they are displayed inside the head-mounted audio-visual device 310 through the display screen assembly 312. In one possible implementation, a touch area is provided on the head-mounted audio-visual device 310, for example a bar-shaped touch area at the temple of the AR glasses. The device 310 has a shooting function: the user adjusts the framing content by changing the device posture of the head-mounted audio-visual device 310, and controls the device to execute a corresponding instruction with a touch operation on the touch area.
In one possible embodiment, the head-mounted audio visual device 310 may be used alone for shooting or in conjunction with the control device 320.
The control device 320 is connected to the head-mounted audiovisual device 310, and the types of devices include: at least one of a handle, a smart phone and a tablet computer. At least one of a touch area and a touch button is provided in the control device 320, and the user adjusts the framing content by changing the device posture of the control device 320 and controls the head-mounted audio-visual device 310 to execute a corresponding instruction in combination with a touch operation on the touch area. In one possible implementation, when the control device 320 is connected with the head mounted audiovisual device 310, the head mounted audiovisual device 310 synchronously receives control operations that act on the touch area of the control device 320.
In one possible embodiment, the head mounted audio visual device 310 and the control device 320 may be connected via a data cable, a Wireless Fidelity (WiFi) hotspot, or bluetooth. The user can choose to wear the head-mounted audio-visual device 310 alone for shooting, or can wear the head-mounted audio-visual device 310 and control the control device 320 for shooting.
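The two possible input sources (the headset's own touch area, or a connected control device) can be modeled as a simple router. This is an assumed sketch; the class, the connection mechanism string, and the event source names are all illustrative, not the patent's API.

```python
# Illustrative sketch: the headset accepts control operations either from its
# own touch area or from a connected control device (handle, smartphone,
# tablet), connected e.g. via data cable, Wi-Fi hotspot, or Bluetooth.

class InputRouter:
    def __init__(self):
        self.connected_control_device = None

    def connect(self, device_name: str) -> None:
        """Register a control device; its touch events are then received
        synchronously alongside the headset's own touch area."""
        self.connected_control_device = device_name

    def sources(self) -> list:
        srcs = ["headset_touch_area"]
        if self.connected_control_device is not None:
            srcs.append(self.connected_control_device)
        return srcs
```

The user may then trigger a control operation from whichever listed source is more convenient.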
Fig. 4 shows a flowchart of a photographing control method according to an exemplary embodiment of the present application. The present embodiment is described by taking the method as an example for the head-mounted audiovisual device 310 shown in fig. 3, and the method includes the following steps:
step 401, a shooting and viewing interface is displayed on the real environment picture in an overlapping manner, and an operation control in the shooting and viewing interface is in an untriggerable state.
After the head-mounted audio-visual equipment is started, real environment pictures are collected in real time, and virtual information needing to be displayed is determined according to user input. In the embodiment of the application, the head-mounted audio-visual device runs a camera application, and the virtual information is a shooting view interface.
In one possible implementation, the head-mounted audio-visual device captures a real environment picture right in front of the device through the camera assembly, and the shooting and viewing interface is fused with the real environment picture and then displayed through the display screen assembly, for example, the display screen assembly is located in front of the head-mounted audio-visual device, so that the user can observe the shooting and viewing interface right in front of the head-mounted audio-visual device after wearing the head-mounted audio-visual device.
When the user wears and turns on the head-mounted audio-visual device, the device determines the focus position as the intersection of a straight line perpendicular to the lens with a vertical plane in front of the user, or as the intersection of a ray (such as an infrared ray) emitted by the control device with that plane; the user changes the focus position by rotating the head to change the device posture of the headset, or by waving the control device to change its posture. Alternatively, if the device has an eyeball-recognition function, the user changes the focus position by changing the gaze direction of the eyes. The head-mounted audio-visual device determines the shooting content according to the focus position.
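The ray-plane intersection described above can be written out explicitly. The geometry below is an assumption about how such a focus position could be computed (it is not quoted from the patent): with ray origin o, direction v, and a vertical plane z = d in front of the user, the intersection parameter is t = (d − o_z) / v_z.

```python
# Focus position as the intersection of a gaze/pointer ray with a vertical
# plane at depth plane_z in front of the user. Coordinate convention and
# function name are illustrative assumptions.

def focus_position(origin, direction, plane_z):
    ox, oy, oz = origin
    vx, vy, vz = direction
    if vz == 0:
        return None  # ray parallel to the plane: no intersection
    t = (plane_z - oz) / vz
    if t < 0:
        return None  # plane is behind the user
    return (ox + t * vx, oy + t * vy, plane_z)
```

Rotating the head or waving the control device changes `direction`, which moves the returned focus point.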
However, triggering an operation control would also require moving the focus position onto the control's icon. To prevent the user from changing the framing content of the head-mounted audio-visual device while triggering an operation control to shoot, the operation controls in the shooting view interface are placed in a non-triggerable state, and the device executes the corresponding instruction through a trigger operation on the touch area. To help the user grasp how to control the device's shooting, the head-mounted audio-visual device displays the icon of a non-triggerable operation control in a predetermined manner, for example in gray or with a no-touch mark, and displays prompt information about the shooting control operation at a predetermined position of the shooting view interface, prompting the user to control shooting through the touch area.
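The "non-triggerable" presentation described in the preceding paragraph (grayed-out icon plus a prompt) might look like the following. Field names and the prompt text are illustrative assumptions.

```python
# Sketch: render a control's icon differently depending on whether it is
# triggerable, and attach the touch-area prompt when it is not.

def render_control(control_name: str, triggerable: bool) -> dict:
    return {
        "name": control_name,
        # non-triggerable controls are displayed in gray (or with a mark)
        "style": "normal" if triggerable else "grayed_out",
        "prompt": None if triggerable
                  else "Use the touch area to control shooting",
    }
```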
Step 402, receiving a control operation acting on a shooting and viewing interface, wherein the control operation is triggered through a touch area of the head-mounted audio-visual device or a control device connected with the head-mounted audio-visual device.
The head-mounted audio-visual device is provided with a touch area for receiving the user's control operations. In one possible implementation, limited by the shape and size of the head-mounted audio-visual device, its touch area is small; to facilitate control operations, the head-mounted audio-visual device is also connected to a control device whose touch area is larger, and the user can select either the touch area of the head-mounted audio-visual device or that of the control device to trigger a control operation as needed.
Referring to fig. 5, a schematic diagram of triggering a control operation is shown. As shown in the figure, the head-mounted audio-visual device 501 is a pair of AR glasses, and a rectangular touch area 502 is disposed on the outer side of the left temple.
And step 403, determining a shooting control instruction corresponding to the control operation according to the operation type of the control operation, and executing the shooting control instruction.
Since the camera application provides various functions such as taking a photo, starting recording, and stopping recording, different functions correspond to different shooting control instructions. To distinguish the shooting control instructions corresponding to different control operations, developers preset a corresponding control operation, such as a sliding operation, a pressing operation, or a clicking operation, for each shooting control instruction, and the head-mounted audio-visual device determines and executes the shooting control instruction corresponding to a control operation according to its operation type.
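Such a preset mapping amounts to a dispatch table keyed by operation type. The concrete pairings below are assumptions for illustration; the patent does not fix which operation maps to which instruction (apart from the long press discussed later).

```python
# Minimal dispatch table: developers preset one control operation per
# shooting control instruction; the device looks up the instruction by the
# received operation's type.

SHOOTING_INSTRUCTIONS = {
    "click": "take_photo",
    "slide": "start_or_stop_recording",
    "long_press": "switch_viewfinder",
}

def instruction_for(operation_type: str) -> str:
    try:
        return SHOOTING_INSTRUCTIONS[operation_type]
    except KeyError:
        raise ValueError(f"no instruction bound to {operation_type!r}")
```

An unbound operation type raises an error rather than silently executing the wrong instruction.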
In summary, in the embodiment of the application, the operation control is in a non-triggerable state while the shooting view interface is displayed, and the corresponding shooting control instruction is executed according to the operation type of the control operation. This avoids the situation where, when the user controls the head-mounted audio-visual device to shoot by triggering an operation control, changing the device posture or the gaze direction of the eyes to focus on the control also changes the view-finding picture, so that the captured picture is not the one the user wanted; the shooting efficiency and accuracy of the head-mounted audio-visual device are thereby improved.
In one possible implementation, the head-mounted audio-visual device determines and executes the corresponding shooting control instruction according to the operation type of the control operation; different operation types differ in characteristics such as operation duration, number of operations, moving distance, and moving direction. Referring to fig. 6, a flowchart of a photographing control method according to an exemplary embodiment of the present application is shown. The present embodiment is described by taking the method as an example for the head-mounted audio-visual device 310 shown in fig. 3, and the method includes the following steps:
step 601, a shooting and viewing interface is displayed on the real environment picture in an overlapping mode, and an operation control in the shooting and viewing interface is in an untriggerable state.
Step 602, receiving a control operation applied to the shooting and viewing interface, where the control operation is triggered by a touch area of the head-mounted audio-visual device or a control device connected to the head-mounted audio-visual device.
For specific implementation of steps 601 and 602, reference may be made to steps 401 and 402, which are not described herein again in this embodiment of the present application.
Since different operation types correspond to different shooting control instructions, after receiving the control operation, the head-mounted audio-visual device performs the following steps 603 to 606, or steps 607 to 609, or step 610:
step 603, in response to the operation type being the long press operation, determining that the shooting control instruction is a view frame switching instruction, and the press duration of the long press operation is greater than a duration threshold.
The viewfinder frame is used to frame the shooting content in the shooting view interface, and is equivalent to the viewfinder of a real camera. On other types of electronic devices, such as smartphones and tablet computers, a user switches the viewfinder frame by changing the posture of the device or by triggering an operation control; on a head-mounted audio-visual device, both methods would change the framing content in the viewfinder frame, so the developers set the control operation type corresponding to the viewfinder switching instruction to a long-press operation.
Illustratively, the duration threshold is 2 seconds: when the head-mounted audio-visual device receives a pressing operation acting on its touch area or on the touch area of the control device, and the pressing duration exceeds 2 seconds, it determines that a viewfinder switching instruction has been received.
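The 2-second threshold from the example above can be checked directly against the press duration. The threshold value comes from the text; the function and its return values are an assumed sketch.

```python
# Classify a press by its duration: only a press strictly longer than the
# threshold counts as a long press and issues a viewfinder-switch instruction.

LONG_PRESS_THRESHOLD_S = 2.0

def classify_press(duration_s: float) -> str:
    if duration_s > LONG_PRESS_THRESHOLD_S:
        return "switch_viewfinder"
    return "ignore"
```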
And step 604, sequentially switching the current view frame into the candidate view frames according to the priority of the candidate view frames according to the pressing duration of the long-press operation.
The frame information of a candidate viewfinder frame differs from that of the current frame; frame information includes at least one of frame shape, aspect ratio, frame size, and viewing area.
In one possible implementation, the head-mounted audio-visual device stores multiple viewfinder frames and can switch to the required frame within a single user operation: according to the pressing duration of the long-press operation, it switches the current frame to the candidate frames in order of their priority. That is, a switching time interval is set in the device; each time the pressing duration reaches another switching interval, the device executes one frame switching instruction and switches the current frame to the next candidate frame.
To further improve the efficiency of frame switching and reduce the user's operation time, the head-mounted audio-visual device determines the priority of the candidate viewfinder frames according to the framing content, and switches the current frame to the candidates in order of priority from high to low.

If there is only one candidate viewfinder frame, the device does not need to determine priorities; it switches the current frame directly to that candidate, toggling between the two frames according to the press duration of the long-press operation.
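The duration-driven switching described above can be sketched as a pure function: given the press duration and the configured switching time interval, compute which frame is currently selected. This is an assumed illustration — the function and parameter names are invented, and the cycling behavior once the press outlasts the whole candidate list is an assumption the text does not specify.

```python
# Hypothetical sketch of step 604: each elapsed switching interval
# advances the selection one step through the priority-ordered candidates.
def current_frame(candidates_by_priority, default_frame,
                  press_duration, switch_interval):
    """candidates_by_priority is ordered from highest to lowest priority.
    Before the first interval elapses, the default frame stays selected."""
    switches = int(press_duration // switch_interval)
    if switches == 0 or not candidates_by_priority:
        return default_frame
    # Wrap around if the press lasts longer than one full pass (assumption).
    idx = (switches - 1) % len(candidates_by_priority)
    return candidates_by_priority[idx]
```

With a 2-second interval, a 2.5-second press selects the highest-priority candidate, and releasing the press fixes the selection, matching the behavior described in step 604 and the stop condition in step 604b's discussion.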
In another possible implementation, one long-press operation by the user causes the head-mounted audio-visual device to perform only a single frame switch: when the device receives a long-press operation whose press duration exceeds the duration threshold, it determines a target frame from the candidates according to the framing content and switches the current frame to that target frame.
Schematically, as shown in fig. 7, the head-mounted audio-visual device displays a shooting viewfinder interface 701 in which the operation control 702 is in a non-triggerable state, and the default viewfinder frame is the same size as the interface 701. When a long-press operation is received and the press duration reaches the switching time interval, the device switches the default frame to the viewfinder frame 703, whose aspect ratio and size differ from those of the default frame. To indicate the newly selected frame to the user, the device displays a mask layer 704 of preset color, transparency, or gray scale over the region of the interface 701 outside frame 703.
In one possible implementation, the head-mounted audio-visual device determines the priority of each candidate frame according to the framing content, and step 604 further includes the following steps:
step 604a, in response to the AR content being included in the framing content, acquiring a display state of the AR content, the display state including at least one of a shape, a size, and a display position.
When AR content is included in the framing content, the user generally intends to capture that AR content, so the head-mounted audio-visual device needs to keep the AR content within the viewfinder frame without requiring a change in device posture or human-eye gaze direction. The device therefore determines the priority of the candidate frames according to the display state of the AR content.

Step 604b: determine the priority of the candidate viewfinder frames according to the display state, where candidate frames satisfying the AR viewing conditions have higher priority than those that do not. The AR viewing conditions include at least one of: the display position lies within the viewing area, the size of the AR content is smaller than the frame size, and the shape of the AR content matches the frame shape.
When the long-press operation is received, the head-mounted audio-visual device switches through the candidate frames in order of priority from high to low, and stops switching when the long-press operation ends.

To keep the AR content within the range of the viewfinder frame, the AR viewing conditions include at least one of the three conditions above. For example, if all three AR viewing conditions are in use, the device ranks the candidate frames by the number of conditions each satisfies: a candidate satisfying all three has the highest priority and is the first-order candidate, one satisfying two conditions is the second-order candidate, one satisfying a single condition is the third-order candidate, and one satisfying none is the fourth-order candidate.
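The count-based ranking in step 604b can be sketched directly: score each candidate by how many AR viewing conditions it satisfies and sort by that score. The data structures here are illustrative assumptions — the patent does not specify how positions, areas, sizes, or shapes are represented.

```python
# Hypothetical sketch of step 604b. Frames and AR content are plain dicts;
# the keys ("position", "viewing_area", "size", "shape") are assumptions.
def ar_conditions_met(frame, ar):
    """Count how many of the three AR viewing conditions a frame satisfies."""
    return sum([
        ar["position"] in frame["viewing_area"],  # display position in area
        ar["size"] < frame["size"],               # AR content smaller than frame
        ar["shape"] == frame["shape"],            # shapes match
    ])

def rank_candidates(frames, ar):
    # Higher score first: first-order, second-order, ... candidates.
    return sorted(frames, key=lambda f: ar_conditions_met(f, ar), reverse=True)
```

The sort is stable, so candidates that satisfy the same number of conditions keep their original relative order; the patent leaves that tie-breaking open.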
In one possible implementation, if the framing content does not include AR content, the head-mounted audio-visual device determines the priority of the candidate frames according to the aspect ratio of the current frame; for example, the candidates are prioritized from high to low in order of decreasing aspect-ratio similarity.
For example, if the aspect ratio of the current frame is 4:3, the aspect ratio of the candidate frame a is 3:2, and the aspect ratio of the candidate frame B is 14:9, it is determined that the candidate frame a has a higher priority than the candidate frame B.
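A minimal sketch of this aspect-ratio fallback, reproducing the worked example: order candidates by how close their aspect ratio is to the current frame's. Names and the (width, height) tuple representation are assumptions; absolute difference of the width/height quotients is one plausible similarity measure, which the patent does not pin down.

```python
# Hypothetical sketch: rank candidate frames by aspect-ratio similarity
# to the current frame. Ratios are (width, height) tuples.
def ratio(wh):
    w, h = wh
    return w / h

def order_by_aspect_similarity(current, candidates):
    # Smaller difference in ratio => higher priority (earlier in the list).
    return sorted(candidates, key=lambda c: abs(ratio(c) - ratio(current)))
```

For a 4:3 current frame, 3:2 (ratio 1.50) is closer to 4:3 (ratio 1.33) than 14:9 (ratio 1.56) is, so candidate A outranks candidate B, as in the text's example.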
Step 605, displaying a frame list in the shooting view interface, wherein the frame list comprises thumbnails of the candidate frames.
When there are multiple candidate viewfinder frames, the user may not know their kinds and priorities, and thus cannot determine the press duration corresponding to the desired frame. In one possible implementation, after the long-press operation is received and the press duration reaches the duration threshold, the head-mounted audio-visual device displays a viewfinder frame list in the shooting viewfinder interface so that the user can subsequently switch frames quickly; the user then switches to the required candidate through a predetermined operation, according to that candidate's display position in the list.

When the head-mounted audio-visual device has received the long-press operation, the press duration has reached the duration threshold, and the device detects the control operation changing from the press operation into a second sliding operation in a preset direction, it displays the viewfinder frame list; the sliding operation here continues directly from the press operation. In one possible embodiment, upon detecting this change from press operation to second sliding operation, the device displays the viewfinder frame list and simultaneously stops switching frames automatically.
Or the head-mounted audio-visual equipment immediately displays the view frame list when the long-press operation is received and the press duration reaches the duration threshold.
Illustratively, the head mounted audio-visual device arranges and displays the thumbnails of the candidate frames in the frame list in order of priority from high to low.
And step 606, responding to the second sliding operation acting on the shooting view interface, and switching the current view frame into the candidate view frame in the view frame list according to the sliding direction and the sliding distance of the second sliding operation.
When the shooting viewfinder interface displays the viewfinder frame list and a second sliding operation is received, the head-mounted audio-visual device switches the current frame to a candidate in the list according to the sliding direction and sliding distance of the second sliding operation. The developer sets in advance the correspondence between the sliding direction and the switching order of the candidates; for example, if the thumbnails of the candidate frames are arranged in a single sequence in priority order, then when the second sliding operation is an upward slide, the device switches through the candidates from bottom to top in the order of their thumbnails.
Illustratively, as shown in fig. 8, a frame list 802 is displayed in the shooting view interface 801, a thumbnail of the current frame 802a is displayed at the top of the frame list 802, and when receiving a second sliding operation in which the sliding direction is from top to bottom, the head mounted audio visual device performs frame switching at intervals of a second switching distance in the order of the candidate frame 802b, the candidate frame 802c, and the other candidate frames below.
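The list-based selection of step 606 can be sketched as index arithmetic: the slide distance, quantized by the second switching distance, determines how many entries the selection moves in the slide direction. All names are assumptions, and clamping at the list ends (rather than wrapping) is likewise an assumption the text leaves open.

```python
# Hypothetical sketch of step 606: pick a frame from the priority-ordered
# list according to the slide direction and quantized slide distance.
def select_from_list(frames, start_index, slide_distance,
                     switch_distance, direction):
    """direction: +1 moves toward lower-priority entries (e.g. top to
    bottom in fig. 8), -1 moves back up. Clamps at the list ends."""
    steps = int(slide_distance // switch_distance)
    idx = start_index + direction * steps
    idx = max(0, min(idx, len(frames) - 1))  # stay within the list
    return frames[idx]
```

With the fig. 8 layout (current frame at the top, candidates 802b and 802c below), a downward slide of two switching distances lands on the second candidate below the current frame.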
In a possible implementation manner, the shooting control method provided by the embodiment of the present application further includes the following steps:
and in response to the adjustment operation on the equipment posture of the head-mounted audio-visual equipment, or the equipment posture of the control equipment, or the human eye gazing direction, updating the framing content of the current framing frame according to the adjusted equipment posture or the human eye gazing direction.
Optionally, the head-mounted audio-visual device determines the framing content according to the focusing position, and the user changes the device posture of the head-mounted audio-visual device by rotating the head, so as to change the focusing position of the head-mounted audio-visual device; or changing the device attitude of the control device by waving the control device, thereby changing the focus position of the head-mounted audio-visual device; or the head-mounted audio-visual equipment has an eyeball identification function, and the user changes the focusing position of the head-mounted audio-visual equipment by changing the watching direction of the eyes, so that the head-mounted audio-visual equipment updates the framing content of the current framing frame.
Step 607: in response to the operation type being a first sliding operation, determine that the shooting control instruction is the function switching instruction, where the sliding distance of the first sliding operation is greater than a distance threshold.

Since a camera application usually provides several functions, such as photographing, video recording, and AR photographing, and users need to switch between them, the developer sets the first sliding operation as the control operation corresponding to the function switching instruction. When the head-mounted audio-visual device receives a first sliding operation, it determines that the shooting control instruction is the function switching instruction. To avoid misrecognizing an incidental finger slide in the touch area during other operations as this control operation, the sliding distance of the first sliding operation must be greater than the distance threshold and the sliding direction must be a preset direction; for example, the sliding distance must exceed 1 cm, and the direction must lie along the front-back axis of the touch area of the head-mounted audio-visual device or the left-right axis of the touch area of the control device.

To indicate the current shooting function to the user, after executing the function switching instruction the head-mounted audio-visual device displays the control icon of the current function distinctively in the shooting viewfinder interface, for example by highlighting it, adding a special mark, or rendering it in a special color.
Step 608, determining the target shooting function according to the current shooting function and the sliding direction of the first sliding operation.
The developer sets in advance a correspondence between the sliding direction of the first sliding operation and the function switching order. In one possible embodiment, when there are three or more shooting functions, the head mounted audio visual device determines the target shooting function according to the sliding direction and the sliding distance of the first sliding operation, and when the sliding distance of the first sliding operation reaches the first switching distance interval, the head mounted audio visual device executes the function switching instruction once.
Illustratively, as shown in fig. 7, the camera application of the head-mounted audio-visual device includes three functions of photographing, recording, and AR camera, icons of the three functions are distributed and displayed from left to right in the photographing and viewing interface 701, if the sliding direction of the first sliding operation in the touch area of the head-mounted audio-visual device is from back to front, or the sliding direction in the touch area of the control device is from left to right, the head-mounted audio-visual device determines the target photographing function according to the sequence of the sequential arrangement of photographing, recording, and AR camera according to the current photographing function, for example, when the current photographing function is recording, the target photographing function is determined to be the AR camera.
And step 609, switching the current shooting function to the target shooting function.
Optionally, each first sliding operation by the user triggers one function switch; or the head-mounted audio-visual device switches the shooting function in real time according to the sliding distance and direction of the first sliding operation. For example, after the function is switched to video recording along the sliding direction, if the user slides in the reverse direction, the device switches back from video recording to photographing; likewise, when the device determines that the sliding distance of the first sliding operation has reached 1 cm, it switches from photographing to video recording, and when the distance reaches 2 cm, it switches from video recording to AR photographing.
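Steps 608 and 609 amount to moving through an ordered list of functions by quantized slide distance. The sketch below uses assumed names; the function order mirrors the left-to-right icon layout of fig. 7, and clamping at the ends of the list (rather than wrapping) is an assumption.

```python
# Hypothetical sketch of steps 608-609: each first-switching-distance
# interval of slide moves the selection one function in the slide direction.
FUNCTIONS = ["photo", "video", "ar_photo"]  # left-to-right order in fig. 7

def target_function(current, slide_distance, switch_distance, direction):
    """direction: +1 for back-to-front (head-mounted touch area) or
    left-to-right (control device); -1 for the reverse slide."""
    steps = int(slide_distance // switch_distance)
    idx = FUNCTIONS.index(current) + direction * steps
    idx = max(0, min(idx, len(FUNCTIONS) - 1))  # clamp at the list ends
    return FUNCTIONS[idx]
```

This reproduces the worked example: from the photo function, a 1 cm slide (with a 1 cm switching distance) selects video recording, and a 2 cm slide selects AR photographing; a reverse slide moves back the other way.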
Step 610: in response to the operation type being a continuous click operation, determine that the shooting control instruction is a shooting instruction or a stop-shooting instruction and execute it, where the number of clicks of the continuous click operation is greater than a count threshold.

Since a click resembles the shutter action of a physical camera, and so that the user can quickly master the control operations corresponding to the shooting instruction and the stop-shooting instruction, the developer sets the continuous click operation as the control operation corresponding to these instructions. Preferably, the click-count threshold is 1: when the number of clicks of the continuous click operation exceeds 1, the head-mounted audio-visual device determines that the shooting control instruction is a shooting instruction or a stop-shooting instruction.

Optionally, the shooting instruction and the stop-shooting instruction correspond to the same number of clicks, for example both being double-click operations; or, to further distinguish the two instructions, the counts differ, for example 2 clicks for the shooting instruction and 3 clicks for the stop-shooting instruction, which is not limited in the embodiments of the present application.
In one possible embodiment, the head-mounted audio-visual device determines the shooting control instruction according to the current shooting function and the shooting status, and step 610 further includes the following steps:
step 610a, in response to the current shooting function being the shooting function, or the current shooting function being the video recording function and the head-mounted audio-visual device being in the non-video recording state, determining the shooting control instruction to be the shooting instruction.
Because the head-mounted audio-visual equipment takes a picture as an instant operation and does not need to start or stop an instruction, if the current shooting function is the shooting function, the head-mounted audio-visual equipment determines the shooting control instruction as the shooting instruction and executes the shooting instruction once to generate an image containing the current framing content.
The video recording function needs a certain time length, and the time length is controlled by a user, so the control operation of starting and stopping the video recording needs to be distinguished, if the current shooting function is the video recording function, the head-mounted audio-visual equipment is in a non-video recording state, and the continuous click operation with the click frequency larger than the frequency threshold value is received, the user needs to start the video recording, and the head-mounted audio-visual equipment determines that the shooting control instruction is the shooting instruction and starts the video recording.
Step 610b, in response to the current shooting function being a video recording function and the head mounted audio visual device being in a video recording state, determining the shooting control instruction to be a shooting stop instruction.
If the current shooting function is the video recording function, the head-mounted audio-visual device is in a recording state, and a continuous click operation with a click count above the count threshold is received, the user intends to stop recording: the device determines the shooting control instruction to be the stop-shooting instruction, stops recording, and generates the recorded video.
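Steps 610a and 610b reduce to a small decision on (current function, recording state). A sketch under assumed names:

```python
# Hypothetical sketch of steps 610a-610b: map a qualifying multi-click to
# the shooting or stop-shooting instruction from the current function and
# recording state. Function labels are assumptions.
def instruction_for_click(function, recording):
    if function == "photo":
        return "shoot"                        # instantaneous, no stop needed
    if function == "video":
        return "stop" if recording else "shoot"
    return None                               # other functions: step 610 silent
```

So a qualifying click run always shoots for the photographing function, starts recording when the recorder is idle, and stops it when a recording is in progress.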
In the embodiments of the present application, the corresponding shooting control instruction is determined and executed by judging the operation type of the control operation, so the user need not change the device posture to trigger an operation control, which improves the efficiency of shooting with the head-mounted audio-visual device. In addition, when a long-press operation is received, the priority of the candidate viewfinder frames is determined according to the framing content and the frames are switched in priority order, allowing the user to reach the required frame quickly and improving frame-switching efficiency.
Fig. 9 is a block diagram of a shooting control apparatus according to an exemplary embodiment of the present application, the apparatus including:
a first display module 901, configured to display a shooting and viewing interface in an overlay manner on a real environment picture, where an operation control in the shooting and viewing interface is in a non-triggerable state;
a receiving module 902, configured to receive a control operation acting on the shooting and viewing interface, where the control operation is triggered by a touch area of the head-mounted audio-visual device or a control device connected to the head-mounted audio-visual device;
a determining module 903, configured to determine, according to the operation type of the control operation, a shooting control instruction corresponding to the control operation, and execute the shooting control instruction.
Optionally, the determining module 903 includes:
a first determining unit, configured to determine that the shooting control instruction is a view finder switching instruction in response to that the operation type is a long press operation, where a press duration of the long press operation is greater than a duration threshold;
a second determining unit, configured to determine that the shooting control instruction is a function switching instruction in response to that the operation type is a first sliding operation, where a sliding distance of the first sliding operation is greater than a distance threshold;
and the third determining unit is used for responding to the fact that the operation type is continuous clicking operation, determining that the shooting control instruction is a shooting instruction or a shooting stopping instruction, and the number of clicks of the continuous clicking operation is larger than a number threshold.
Optionally, the shooting control instruction is the viewfinder frame switching instruction;
the determining module 903 includes:
and the first switching unit is used for sequentially switching the current view frame into the candidate view frames according to the pressing duration of the length pressing operation and the priority of the candidate view frames, wherein the candidate view frames are different from the view frame information of the current view frame, and the view frame information comprises at least one of the shape, the aspect ratio, the size and the view area of the view frame.
Optionally, the first switching unit is further configured to:
in response to the inclusion of the AR content in the viewfinder content, obtaining a display state of the AR content, the display state including at least one of a shape, a size, and a display position;
determining, according to the display state, a priority of the candidate frame, the priority of the candidate frame satisfying an AR viewing condition being higher than the priority of the candidate frame not satisfying the AR viewing condition, the AR viewing condition including at least one of the display position being located in the viewing area, the size of the AR content being smaller than the frame size, and the shape of the AR content matching the frame shape.
Optionally, the apparatus further comprises:
the second display module is used for displaying a view frame list in the shooting view frame interface, and the view frame list comprises thumbnails of the candidate view frames;
and the switching module is used for responding to a second sliding operation acted on the shooting and viewing interface and switching the current viewing frame into the candidate viewing frame in the viewing frame list according to the sliding direction and the sliding distance of the second sliding operation.
Optionally, the apparatus further comprises:
and the updating module is used for responding to the adjustment operation of the equipment posture of the head-mounted audio-visual equipment, the equipment posture of the control equipment or the human eye watching direction and updating the framing content of the current framing frame according to the adjusted equipment posture or the human eye watching direction.
Optionally, the determining module 903 includes:
a fourth determination unit configured to determine a target shooting function according to the current shooting function and the sliding direction of the first sliding operation;
and the second switching unit is used for switching the current shooting function to the target shooting function.
Optionally, the third determining unit is further configured to:
in response to that the current shooting function is a shooting function, or the current shooting function is a video recording function and the head-mounted audio-visual equipment is in a non-video recording state, determining the shooting control instruction as the shooting instruction;
and determining the shooting control instruction as the shooting stop instruction in response to the current shooting function being the video recording function and the head-mounted audio-visual equipment being in a video recording state.
In summary, in the embodiments of the present application, the operation control is in a non-triggerable state while the shooting viewfinder interface is displayed, and the corresponding shooting control instruction is executed by judging the operation type of the control operation. This avoids the situation in which a user who controls shooting by triggering an operation control changes the framed picture while changing the device posture or human-eye focusing direction to focus on the control, so that the captured picture is not the one the user intended, and thereby improves the efficiency and accuracy of shooting with the head-mounted audio-visual device.
As shown in fig. 10, an embodiment of the present application provides a head-mountable audiovisual device 1000, where the head-mountable audiovisual device 1000 may include one or more of the following components: a processor 1001, a memory 1002, a power component 1003, a multimedia component 1004, an audio component 1005, an Input/Output (I/O) interface 1006, a sensor component 1007, and a communication component 1008.
The processor 1001 generally controls the overall operation of the head-mounted audio-visual device, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processor 1001 may include one or more processing cores. Using various interfaces and circuits, the processor 1001 connects the various parts of the device 1000, and performs the functions of the device 1000 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1002 and invoking data stored in the memory 1002. Optionally, the processor 1001 may be implemented in hardware in the form of at least one of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 1001 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, user interface, applications, and so on; the GPU renders and draws the content to be displayed on the screen; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 1001 and may instead be implemented by a separate communication chip.
The memory 1002 is configured to store various types of data to support operation of the head-mounted audio-visual device. Examples of such data include instructions, models, contact data, phone book data, messages, images, and videos for any application or method operating on the device. The memory 1002 may include Random Access Memory (RAM) or Read-Only Memory (ROM). Optionally, the memory 1002 includes a non-transitory computer-readable medium. The memory 1002 may be used to store instructions, programs, code sets, or instruction sets, and may include a program storage area and a data storage area. The program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, and the like), instructions for implementing the above method embodiments, and so on; the operating system may be an Android system (including systems developed in depth on the basis of the Android system), an iOS system developed by Apple Inc. (including systems developed in depth on the basis of the iOS system), or another system. The data storage area may also store data created by the device 1000 in use (e.g., phone book, audio and video data, chat log data), and the like.
The power supply component 1003 provides power to the various components of the head mounted audiovisual device 1000. The power components 1003 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the head-mounted audiovisual device 1000.
The multimedia component 1004 includes a screen that provides an output interface between the head mounted audiovisual device 1000 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1004 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the head mounted audio visual device 1000 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1005 is configured to output and/or input audio signals. For example, the audio component 1005 includes a Microphone (MIC) configured to receive external audio signals when the head-mounted audio visual device 1000 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1002 or transmitted via the communication component 1008. In some embodiments, audio component 1005 also includes a speaker for outputting audio signals.
The I/O interface 1006 provides an interface between the processor 1001 and peripheral interface modules, such as a keyboard, click wheel, buttons, touch pad, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1007 includes one or more sensors for providing various aspects of status assessment for the head-mounted audiovisual device 1000. For example, the sensor assembly 1007 may detect the open/closed state of the head mounted audio visual device 1000, the relative positioning of components, such as the display screen and keypad of the head mounted audio visual device 1000, the sensor assembly 1007 may also detect a change in the position of the head mounted audio visual device 1000 or a component of the head mounted audio visual device 1000, the presence or absence of user contact with the head mounted audio visual device 1000, the orientation or acceleration/deceleration of the head mounted audio visual device 1000, and a change in the temperature of the head mounted audio visual device 1000. The sensor assembly 1007 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1007 may also include a light sensor for use in imaging applications. In some embodiments, the sensor assembly 1007 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor. For example, the head-mounted audio-visual device 1000 determines the operation type of the control operation by the pressure sensor.
The communication component 1008 is configured to facilitate communication between the head-mounted audiovisual device 1000 and other devices (e.g., control devices) in a wired or wireless manner. The head-mounted audiovisual device 1000 may access a wireless network based on a communication standard. In an exemplary embodiment, the communication component 1008 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the Communication component 1008 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wide Band (UWB) technology, BlueTooth (BlueTooth, BT) technology, and other technologies. The head-mounted audio/visual device 1000 synchronously receives information sent by the control device, for example, a touch operation applied to the touch area received by the control device, through the communication component 1008.
In addition, those skilled in the art will appreciate that the configuration of the device 1000 illustrated in the above figures does not limit the device 1000: the device may include more or fewer components than those illustrated, some components may be combined, or the components may be arranged differently.
An embodiment of the present application further provides a computer-readable storage medium storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the shooting control method described in the above embodiments.
According to an aspect of the application, a computer program product or computer program is provided that comprises computer instructions stored in a computer-readable storage medium. A processor of the head-mounted audiovisual device reads the computer instructions from the computer-readable storage medium and executes them, causing the head-mounted audiovisual device to perform the shooting control method provided in the various optional implementations of the above aspect.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on, or transmitted as one or more instructions or code over, a computer-readable storage medium. Computer-readable storage media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The above description is merely exemplary of the present application and is not intended to be limiting; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within its protection scope.

Claims (11)

1. A shooting control method for a head-mounted audiovisual device, the method comprising:
displaying a shooting viewfinder interface superimposed on a real-environment picture, wherein an operation control in the shooting viewfinder interface is in a non-triggerable state;
receiving a control operation acting on the shooting viewfinder interface, wherein the control operation is triggered through a touch area of the head-mounted audiovisual device or through a control device connected to the head-mounted audiovisual device;
and determining, according to an operation type of the control operation, a shooting control instruction corresponding to the control operation, and executing the shooting control instruction, wherein the shooting control instruction comprises a viewfinder frame switching instruction, a function switching instruction, a shooting instruction, and a shooting stop instruction; the viewfinder frame switching instruction instructs the head-mounted audiovisual device to determine the priority of candidate viewfinder frames based on a display state of augmented reality (AR) content within the framed content and to switch the viewfinder frame according to the operation duration of the control operation and the priority, the display state comprising at least one of a shape, a size, and a display position.
2. The method according to claim 1, wherein the determining, according to the operation type of the control operation, a shooting control instruction corresponding to the control operation and executing the shooting control instruction comprises:
in response to the operation type being a long-press operation, determining that the shooting control instruction is the viewfinder frame switching instruction, wherein the press duration of the long-press operation is greater than a duration threshold;
in response to the operation type being a first sliding operation, determining that the shooting control instruction is the function switching instruction, wherein the sliding distance of the first sliding operation is greater than a distance threshold;
and in response to the operation type being a continuous click operation, determining that the shooting control instruction is the shooting instruction or the shooting stop instruction, wherein the number of clicks of the continuous click operation is greater than a count threshold.
3. The method according to claim 2, wherein the shooting control instruction is the viewfinder frame switching instruction;
the executing the shooting control instruction includes:
and sequentially switching the current viewfinder frame to the candidate viewfinder frames according to the press duration of the long-press operation and the priority of the candidate viewfinder frames, wherein the viewfinder frame information of the candidate viewfinder frames differs from that of the current viewfinder frame, the viewfinder frame information comprising at least one of the shape, the aspect ratio, the size, and the framing area of the viewfinder frame.
4. The method according to claim 3, wherein the sequentially switching the current viewfinder frame to the candidate viewfinder frames according to the priority of the candidate viewfinder frames comprises:
in response to AR content being included in the framed content, obtaining the display state of the AR content;
and determining, according to the display state, the priority of the candidate viewfinder frames, wherein the priority of a candidate viewfinder frame satisfying an AR framing condition is higher than the priority of a candidate viewfinder frame not satisfying the AR framing condition, the AR framing condition comprising at least one of: the display position being located in the framing area, the size of the AR content being smaller than the size of the viewfinder frame, and the shape of the AR content matching the shape of the viewfinder frame.
5. The method according to claim 3, wherein after the sequentially switching the current viewfinder frame to the candidate viewfinder frames according to the press duration of the long-press operation and the priority of the candidate viewfinder frames, the method further comprises:
displaying a viewfinder frame list in the shooting viewfinder interface, wherein the viewfinder frame list comprises thumbnails of the candidate viewfinder frames;
and in response to a second sliding operation acting on the shooting viewfinder interface, switching the current viewfinder frame to a candidate viewfinder frame in the viewfinder frame list according to the sliding direction and sliding distance of the second sliding operation.
6. The method according to any one of claims 3 to 5, wherein the framed content of the current viewfinder frame is determined based on the device posture of the head-mounted audiovisual device, the device posture of the control device, or a focus position corresponding to the gaze direction of the human eye;
the method further comprising:
in response to an adjustment operation on the device posture of the head-mounted audiovisual device, the device posture of the control device, or the human-eye gaze direction, updating the framed content of the current viewfinder frame according to the focus position corresponding to the adjusted device posture or to the adjusted gaze direction.
7. The method according to any one of claims 2 to 5, wherein the shooting control instruction is the function switching instruction;
the executing the shooting control instruction includes:
determining a target shooting function according to the current shooting function and the sliding direction of the first sliding operation;
and switching the current shooting function to the target shooting function.
8. The method according to any one of claims 2 to 5, wherein the determining that the shooting control instruction is the shooting instruction or the shooting stop instruction in response to the operation type being a continuous click operation comprises:
in response to the current shooting function being a photo shooting function, or the current shooting function being a video recording function while the head-mounted audiovisual device is not in a video recording state, determining that the shooting control instruction is the shooting instruction;
and in response to the current shooting function being the video recording function while the head-mounted audiovisual device is in a video recording state, determining that the shooting control instruction is the shooting stop instruction.
9. A shooting control apparatus, characterized in that the apparatus comprises:
a first display module, configured to display a shooting viewfinder interface superimposed on a real-environment picture, wherein an operation control in the shooting viewfinder interface is in a non-triggerable state;
a receiving module, configured to receive a control operation acting on the shooting viewfinder interface, wherein the control operation is triggered through a touch area of the head-mounted audiovisual device or through a control device connected to the head-mounted audiovisual device;
and a determining module, configured to determine, according to an operation type of the control operation, a shooting control instruction corresponding to the control operation, and to execute the shooting control instruction, wherein the shooting control instruction comprises a viewfinder frame switching instruction, a function switching instruction, a shooting instruction, and a shooting stop instruction; the viewfinder frame switching instruction instructs the head-mounted audiovisual device to determine the priority of candidate viewfinder frames based on a display state of augmented reality (AR) content within the framed content and to switch the viewfinder frame according to the priority of the candidate viewfinder frames, the display state comprising at least one of a shape, a size, and a display position.
10. A head-mounted audiovisual device, characterized in that it comprises a processor and a memory, the memory storing at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the shooting control method according to any one of claims 1 to 8.
11. A computer-readable storage medium, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the shooting control method according to any one of claims 1 to 8.
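As an illustration of the priority rule in claim 4 (candidate viewfinder frames whose AR framing condition is satisfied rank above those whose condition is not), the ranking could be sketched as follows. This is a minimal sketch, not part of the claims; all class names, fields, and the rectangular-geometry interpretation of "located in the framing area" are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A candidate viewfinder frame (hypothetical representation)."""
    name: str
    width: int
    height: int
    shape: str   # e.g. "rect", "square"

@dataclass
class ARContent:
    """Display state of the AR content: position, size, and shape."""
    x: int
    y: int
    width: int
    height: int
    shape: str

def satisfies_ar_condition(frame, ar):
    """Check the AR framing condition of claim 4 for one candidate frame."""
    inside = (0 <= ar.x and ar.x + ar.width <= frame.width
              and 0 <= ar.y and ar.y + ar.height <= frame.height)
    smaller = ar.width < frame.width and ar.height < frame.height
    matches = ar.shape == frame.shape
    return inside and smaller and matches

def rank_candidates(frames, ar):
    """Order candidate frames so that those satisfying the condition come first."""
    return sorted(frames, key=lambda f: satisfies_ar_condition(f, ar), reverse=True)
```

Sequential switching under a long press (claim 3) would then walk this ranked list, one frame per elapsed duration step.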
CN202010962809.8A 2020-09-14 2020-09-14 Shooting control method, device, equipment and storage medium Active CN111970456B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010962809.8A CN111970456B (en) 2020-09-14 2020-09-14 Shooting control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010962809.8A CN111970456B (en) 2020-09-14 2020-09-14 Shooting control method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111970456A CN111970456A (en) 2020-11-20
CN111970456B true CN111970456B (en) 2022-01-11

Family

ID=73393219

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010962809.8A Active CN111970456B (en) 2020-09-14 2020-09-14 Shooting control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111970456B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112463016B (en) * 2020-12-09 2023-01-06 Oppo广东移动通信有限公司 Display control method and device, electronic equipment and wearable display equipment
CN112905088B (en) * 2021-02-07 2022-11-04 北京蜂巢世纪科技有限公司 Video shooting control method and device
CN115695768A (en) * 2021-07-26 2023-02-03 北京有竹居网络技术有限公司 Photographing method, photographing apparatus, electronic device, storage medium, and computer program product
CN114189628A (en) * 2021-11-30 2022-03-15 歌尔光学科技有限公司 Control method and device of shooting function, AR device and storage medium
CN116684725B (en) * 2022-10-18 2024-04-16 荣耀终端有限公司 Layout method and device of application interface, electronic equipment, storage medium and chip

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11142920A (en) * 1997-11-10 1999-05-28 Canon Inc Picture frame size switching camera
CN102156606A (en) * 2010-11-17 2011-08-17 华为终端有限公司 Finder frame processing method, picture processing method and user equipment
CN102647552A (en) * 2011-02-22 2012-08-22 华为终端有限公司 Image-taking control method and device
CN106027879A (en) * 2016-04-29 2016-10-12 努比亚技术有限公司 Image acquiring method and apparatus and mobile terminal

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106200899A (en) * 2016-06-24 2016-12-07 北京奇思信息技术有限公司 The method and system that virtual reality is mutual are controlled according to user's headwork
CN107302655B (en) * 2016-09-29 2019-11-01 维沃移动通信有限公司 It is a kind of to shoot the adjusting method and mobile terminal found a view
US10553036B1 (en) * 2017-01-10 2020-02-04 Lucasfilm Entertainment Company Ltd. Manipulating objects within an immersive environment
EP3664424A4 (en) * 2017-08-18 2020-07-15 Huawei Technologies Co., Ltd. Display method and terminal
US10754496B2 (en) * 2017-08-24 2020-08-25 Microsoft Technology Licensing, Llc Virtual reality input
US10964110B2 (en) * 2018-05-07 2021-03-30 Vmware, Inc. Managed actions using augmented reality


Also Published As

Publication number Publication date
CN111970456A (en) 2020-11-20

Similar Documents

Publication Publication Date Title
CN111970456B (en) Shooting control method, device, equipment and storage medium
CN111541845B (en) Image processing method and device and electronic equipment
CN112286362B (en) Method, system and storage medium for displaying virtual prop in real environment picture
US20180288391A1 (en) Method for capturing virtual space and electronic device using the same
CN107977083B (en) Operation execution method and device based on VR system
CN108038726B (en) Article display method and device
CN106791893A (en) Net cast method and device
CN112073764B (en) Display equipment
EP3299946B1 (en) Method and device for switching environment picture
CN107515669B (en) Display method and device
CN113064684B (en) Virtual reality equipment and VR scene screen capturing method
CN109496293B (en) Extended content display method, device, system and storage medium
CN113938748B (en) Video playing method, device, terminal, storage medium and program product
CN110751707B (en) Animation display method, animation display device, electronic equipment and storage medium
CN112732089A (en) Virtual reality equipment and quick interaction method
CN112783316A (en) Augmented reality-based control method and apparatus, electronic device, and storage medium
CN111782053B (en) Model editing method, device, equipment and storage medium
CN209859042U (en) Wearable control device and virtual/augmented reality system
CN107918514B (en) Display method and device, electronic equipment and computer readable storage medium
CN112905007A (en) Virtual reality equipment and voice-assisted interaction method
KR20170046947A (en) Mobile terminal and method for controlling the same
CN110955328B (en) Control method and device of electronic equipment and storage medium
CN111782056A (en) Content sharing method, device, equipment and storage medium
KR20150071498A (en) Mobile terminal and method for controlling the same
CN117406851A (en) Equipment control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant