CN105630170B - Information processing method and electronic equipment

Information processing method and electronic equipment

Info

Publication number
CN105630170B
Authority
CN
China
Prior art keywords
sub
display
images
image
electronic equipment
Prior art date
Legal status
Active
Application number
CN201510996998.XA
Other languages
Chinese (zh)
Other versions
CN105630170A
Inventor
陈悦
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201510996998.XA
Publication of CN105630170A
Application granted
Publication of CN105630170B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an information processing method and an electronic device, which address the technical problem that an electronic device is highly limited in how it handles displayed content. The method comprises the following steps: displaying a first display file in the electronic device through a display unit of the electronic device in response to a received play operation performed on the electronic device, wherein the electronic device is worn on the head of a user, can determine a relative angle of the electronic device with respect to a reference datum according to the user's head movement, and can determine, according to the relative angle, a sub-image corresponding to the display unit within an image included in the first display file, the sub-image being a part of the corresponding image; controlling the display unit to sequentially display N sub-images while the user wears the electronic device, the N sub-images being images that the electronic device determines in the first display file according to the user's head movement and displays through the display unit, N being a positive integer; and generating a second display file corresponding to the N sub-images.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to an information processing method and an electronic device.
Background
With the continuous development of science and technology, electronic technology has advanced rapidly. Visual imaging technology in particular has brought many brand-new experiences: 3D (three-dimensional) technology gives viewers a stereoscopic impression, 360-degree playing technology makes viewers feel as though they were on the scene, and so on. Electronic devices supporting these new technologies have emerged accordingly.
Currently, when a user plays content on an electronic device supporting 360-degree playing technology, the device can obtain the user's viewing direction in real time. For example, the electronic device may be a head-mounted device whose angular direction is determined from the user's up-down or left-right head movements and used to switch the displayed content, so that as the user's viewing angle moves to different positions, the display screen of the device shows the image corresponding to the current viewing direction.
In practical applications, different users playing the same file on an electronic device supporting 360-degree playing technology may look in different directions, so the content they view may also differ. However, current electronic devices do not organize or record the display content actually viewed by a user, so the user cannot obtain the content that was observed, cannot share it with other users, and cannot even easily view the same content again; the user experience is therefore poor.
Disclosure of Invention
The application provides an information processing method and an electronic device, which solve the technical problem that an electronic device is highly limited in how it handles displayed content, and improve the user experience.
An information processing method comprising the steps of:
displaying a first display file in the electronic device through a display unit of the electronic device in response to a received play operation for the electronic device; the electronic device is worn on the head of a user, can determine a relative angle of the electronic device with respect to a reference datum according to the head movement of the user, and can determine, according to the relative angle, a sub-image corresponding to the display unit in an image included in the first display file, the sub-image being a part of the corresponding image;
controlling the display unit to sequentially display N sub-images while the user wears the electronic device; the N sub-images are images determined in the first display file by the electronic device according to the head movement of the user and displayed through the display unit, and N is a positive integer;
and generating a second display file corresponding to the N sub-images.
Optionally, while controlling the display unit to sequentially display the N sub-images, the method further includes:
recording N pieces of display information related to the N sub-images, wherein each piece of display information in the N pieces of display information comprises a parameter of a corresponding sub-image and a display time when the sub-image is displayed through the display unit, and the parameter is used for determining the N sub-images from the first display file;
the generating a second display file corresponding to the N sub-images includes:
and generating the second display file according to the N pieces of display information and the first display file.
Optionally, the parameter corresponding to each sub-image in the N sub-images includes a position parameter of the sub-image in the corresponding image, where the position parameter is used to indicate a position of the sub-image in the corresponding image,
generating the second display file according to the N display information and the first display file comprises:
and generating the second display file according to the first display file and the N corresponding position parameters in the N sub-images.
Optionally, the parameter corresponding to each sub-image in the N sub-images includes an angle parameter corresponding to the sub-image, where the angle parameter is a relative angle between the electronic device and the reference datum determined by detecting the head movement of the user,
generating the second display file according to the N display information and the first display file comprises:
determining N display moments corresponding to the N images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N images according to the sequence of the N display moments to generate the second display file.
Optionally, the generating a second display file corresponding to the N sub-images includes:
obtaining the N sub-images;
and generating the second display file according to the N sub-images.
Optionally, the obtaining the N sub-images includes:
and intercepting an image corresponding to each display screen in the N display screens of the display unit as the N sub-images.
Optionally, the obtaining the N sub-images includes:
determining the position information of each sub-image in the N sub-images in the corresponding image, and extracting the N sub-images from the first display file according to the position information; and the position information is information which is used for representing the position of the sub-image in the corresponding image and is determined according to the corresponding relative angle of the sub-image.
Optionally, after generating the second display file corresponding to the N sub-images, the method further includes:
receiving a play operation for the second display file;
reading a parameter corresponding to each sub-image in the N sub-images according to the playing operation;
searching the N sub-images from the first display file according to the read parameters;
and displaying the N sub-images.
An electronic device, comprising:
a display for displaying a display file in an electronic device;
the processor is used for displaying a first display file in the electronic equipment through the display according to the received playing operation performed on the electronic equipment, controlling the display unit to sequentially display N sub-images in the process that the user wears the electronic equipment, and generating a second display file corresponding to the N sub-images;
the electronic equipment is worn on the head of a user, the electronic equipment can determine a relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, the sub-image is a part of a corresponding image, the N sub-images are the images which are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display unit, and N is a positive integer.
Optionally, the processor is further configured to:
the method comprises the steps that N pieces of display information related to N sub-images are recorded while the display unit is controlled to sequentially display the N sub-images, wherein each piece of display information in the N pieces of display information comprises a parameter of the corresponding sub-image and a display time when the sub-image is displayed through the display unit, and the parameter is used for determining the N sub-images from a first display file;
and generating the second display file according to the N pieces of display information and the first display file.
Optionally, the parameter corresponding to each sub-image in the N sub-images includes a position parameter of the sub-image in the corresponding image, where the position parameter is used to indicate a position of the sub-image in the corresponding image, and the processor is configured to:
and generating the second display file according to the first display file and the N corresponding position parameters in the N sub-images.
Optionally, the parameter corresponding to each sub-image in the N sub-images includes an angle parameter corresponding to the sub-image, where the angle parameter is a relative angle between the electronic device and the reference datum determined by detecting the head movement of the user, and the processor is configured to:
determining N display moments corresponding to the N images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N images according to the sequence of the N display moments to generate the second display file.
Optionally, the processor is configured to:
obtaining the N sub-images;
and generating the second display file according to the N sub-images.
Optionally, the processor is configured to:
and intercepting an image corresponding to each display screen in the N display screens of the display unit as the N sub-images.
Optionally, the processor is configured to:
determining the position information of each sub-image in the N sub-images in the corresponding image, and extracting the N sub-images from the first display file according to the position information; and the position information is information which is used for representing the position of the sub-image in the corresponding image and is determined according to the corresponding relative angle of the sub-image.
Optionally, the processor is configured to:
after generating a second display file corresponding to the N sub-images, reading a parameter corresponding to each sub-image in the N sub-images according to a received playing operation for the second display file, and searching and displaying the N sub-images from the first display file according to the read parameters.
An electronic device, comprising:
the operation module is used for displaying a first display file in the electronic equipment through a display unit of the electronic equipment according to the received playing operation aiming at the electronic equipment; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, wherein the sub-image is a part of a corresponding image;
the control module is used for controlling the display unit to sequentially display the N sub-images in the process that the user wears the electronic equipment; the N sub-images are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display unit, and N is a positive integer;
and the generating module is used for generating a second display file corresponding to the N sub-images.
In the application, according to a received play operation performed on the electronic device, a first display file in the electronic device is displayed through a display unit. The electronic device is worn on the head of a user, and as the user's head moves, a relative angle of the electronic device with respect to a reference datum can be determined; a sub-image corresponding to the display unit in an image included in the first display file can then be determined according to the relative angle. While the user wears the electronic device, the display unit can therefore be controlled to sequentially display N sub-images, which are images determined by the electronic device in the first display file according to the user's head movement and displayed through the display unit. A second display file corresponding to the N sub-images, that is, a display file specific to the wearer, is then generated, so that the N sub-images can be viewed again through the second display file or even shared with other users. In this way the electronic device organizes the displayed content more effectively and the user experience is improved.
Drawings
FIG. 1 is a schematic diagram of an electronic device in an embodiment of the invention;
FIG. 2 is a schematic diagram of the relative angle between the user's gaze and a reference datum in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a display unit displaying sub-images according to an embodiment of the present invention;
FIG. 4 is a main flowchart of an information processing method according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
fig. 6 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The application provides an information processing method and an electronic device, which solve the problem that an electronic device is highly limited in how it handles displayed content, and improve the user experience.
In order to solve the technical problems, the technical scheme provided by the application has the following general idea:
in the application, according to a received playing operation performed on the electronic device, a first display file in the electronic device is displayed through a display unit, wherein the electronic device is worn on the head of a user, and along with the operation of the head of the user, a relative angle of the electronic device with respect to a reference datum can be determined, and then a sub-image corresponding to the display unit in an image included in the first display file can be determined according to the relative angle, so that in the process of wearing the electronic device by the user, the display unit can be controlled to sequentially display N sub-images, which are images determined by the electronic device in the first display file according to the head movement of the user and displayed through the display unit, and then a second display file corresponding to the N sub-images, which is a second display file related to the wearer, is generated, so that the N sub-images can be viewed again through the second display file, even sharing to other users, etc. so that the electronic equipment has better sorting effect on the displayed content and higher user experience.
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In an embodiment of the present invention, the electronic device may be one that supports 360-degree playing technology, and preferably a device having a wearable structure. Fig. 1 is a schematic diagram of such an electronic device, in which numeral 1 denotes the electronic device and numeral 2 denotes a fixing structure for fixing the electronic device; the fixing structure may include a socket matching the electronic device. For convenience of describing the information processing method in the embodiment of the present invention, the electronic device supporting 360-degree playing technology and its display mode are described first.
In practical applications, the electronic device may include a display unit, and when the electronic device is used, the user fixes the electronic device on the head through the fixing structure, and in this case, the display unit is located in front of the eyes of the user. Since the electronic device is fixed on the head of the user, when the user moves the head, the electronic device moves along with the head.
In practical applications, the electronic device further comprises a sensor for detecting the relative position between the user's line of sight and the reference datum. Specifically, in the embodiment of the present invention, the reference datum may be an axis passing through the center of the user's head and perpendicular to the ground, or an axis pointing in a fixed direction, and the sensor may be a nine-axis gravity sensor or the like. Those skilled in the art may choose these according to practical requirements; the present invention is not particularly limited in this respect.
Generally, when a user changes a viewing sight line, the head of the user may move correspondingly, for example, when the user looks to the left, the head of the user correspondingly turns to the left, so in the embodiment of the present invention, a relative position of the detected sight line of the user and the reference may be considered as a relative position of the electronic device and the reference, and an angle corresponding to the relative position is a relative angle.
Optionally, the electronic device detects the relative position between the user's line of sight and the reference datum through the sensor, and then presents, on the display unit, the part of the image in the first display file that corresponds to the relative angle. Specifically, an image included in the first display file in the embodiment of the present application may be 360-degree annular image content. Fig. 2 shows the display content in the horizontal direction of the first display file, where X is the reference datum, S is the viewing line of sight as the user's head moves, and β is the relative angle formed between the user's viewing line of sight and the reference datum.
Generally, the 360-degree annular image content may differ at different playing times, and at any one time the display unit can only display a portion spanning a preset angle (e.g. α degrees) of the 360-degree annular image content, where 0 < α < 360. According to the relative angle (i.e. β degrees), the electronic device determines which α-degree portion of the display content is to be displayed on the display unit.
For example, assuming that the displayed image is the annular image shown in fig. 2 and the relative angle between the user's viewing line of sight S and the reference datum X is β degrees clockwise, the electronic device may determine the α-degree portion of fig. 2 corresponding to that relative angle as the sub-image corresponding to the display unit, and display it as shown in fig. 3.
When the user moves the head, the electronic device obtains the new relative angle in the same way and adjusts the sub-image displayed on the display unit accordingly, so that as the relative angle of the electronic device changes with the user's head movement, different parts of the content are seen and an immersive, on-the-scene experience is produced.
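For readers who want a concrete picture of this step, the following sketch (Python, purely illustrative and not part of the patent text) shows one possible way to cut the α-degree sub-image out of a 360-degree frame given the relative angle β. The frame layout (a NumPy array whose width spans the full 360 degrees) and the choice to centre the sub-image on β are assumptions made only for the example.

```python
# Illustrative sketch only; the frame layout and the centring convention are assumptions.
import numpy as np

def extract_sub_image(frame: np.ndarray, beta_deg: float, alpha_deg: float) -> np.ndarray:
    """Return the part of a 360-degree ring frame that the display unit would show.

    frame     -- H x W x C image whose width covers 0..360 degrees horizontally
    beta_deg  -- relative angle between the electronic device and the reference datum
    alpha_deg -- preset display angle, with 0 < alpha_deg < 360
    """
    height, width = frame.shape[:2]
    px_per_deg = width / 360.0
    start_col = int(((beta_deg - alpha_deg / 2.0) % 360.0) * px_per_deg)
    span = int(alpha_deg * px_per_deg)
    cols = np.arange(start_col, start_col + span)
    # mode="wrap" lets the slice cross the 0/360-degree seam of the ring image
    return np.take(frame, cols, axis=1, mode="wrap")
```

With α = 40 and β = 30, for instance, the returned slice covers roughly the 10° to 50° band of the ring, matching the numerical example discussed later in the text.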
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in fig. 4, an embodiment of the present invention discloses an information processing method, which may be described as follows.
S11: displaying a first display file in the electronic equipment through a display unit of the electronic equipment according to the received playing operation performed on the electronic equipment; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, wherein the sub-image is a part of the corresponding image;
s12: controlling a display unit to sequentially display N sub-images in the process of wearing the electronic equipment by a user; the N sub-images are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display unit, and N is a positive integer;
s13: and generating a second display file corresponding to the N sub-images.
In this embodiment of the present invention, the received play operation may be an operation performed by a user to control the electronic device to play a first display file, and the first display file may be a video file for 360-degree play. For example, after the user wears the electronic device on the head, a video file in the electronic device may be played by operating the corresponding button.
While the user wears the electronic device, the relative angle between the electronic device and the reference datum can be sensed in real time through the sensor, and the sub-image corresponding to the display unit is determined in the image included in the first display file according to that relative angle. The N sub-images are therefore the pictures sequentially displayed on the display unit, determined according to the relative angle while the user wears the device.
In practical applications, each of the N sub-images has a corresponding image (i.e., a ring image) in the first display file, and each sub-image is a portion of the corresponding image.
Optionally, when the display unit is controlled to sequentially display the N sub-images, N pieces of display information related to the N sub-images may be recorded, and each piece of display information in the N pieces of display information may include a parameter of a corresponding sub-image and a corresponding display time when the image is displayed by the display unit, where the parameter may be used to determine the N sub-images from the first display file.
At this time, the process of S13 may be to generate the second display file from the N display information and the first display file.
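As a sketch of how the recording could work in practice (again illustrative only; `read_relative_angle` and `show_sub_image` are hypothetical stand-ins for the device's sensor and display calls, and `extract_sub_image` is the helper from the earlier sketch), one display-information record per displayed sub-image might be accumulated like this:

```python
# Hypothetical recording loop; sensor and display calls are placeholders, not a real device API.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class DisplayInfo:
    display_time_s: float       # display moment of the corresponding image in the first display file
    relative_angle_deg: float   # angle parameter (a position parameter could be stored instead)

def record_display_info(frames: Iterable,
                        frame_rate_hz: float,
                        read_relative_angle: Callable[[], float],
                        show_sub_image: Callable[[object], None],
                        alpha_deg: float = 40.0) -> List[DisplayInfo]:
    """Display the first file sub-image by sub-image and record one DisplayInfo per shown frame."""
    records: List[DisplayInfo] = []
    for index, frame in enumerate(frames):
        beta = read_relative_angle()                        # sensed head orientation (beta degrees)
        show_sub_image(extract_sub_image(frame, beta, alpha_deg))
        records.append(DisplayInfo(display_time_s=index / frame_rate_hz,
                                   relative_angle_deg=beta))
    return records
```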
Optionally, according to different parameters included in the display information, the manner of generating the second display file may include, but is not limited to, the following two cases.
Case one: the parameter corresponding to each of the N sub-images comprises a position parameter of the sub-image in the corresponding image, the position parameter indicating the position of the sub-image in that image. In this case, the second display file is generated according to the first display file and the N position parameters corresponding to the N sub-images.
The position parameter may be a position parameter corresponding to the determined sub-image in the corresponding image during the process of displaying the N sub-images by the electronic device. Alternatively, the position parameter may be represented by a pixel coordinate, and of course, the electronic device may also use other manners to represent the position of the sub-image in the corresponding image, which is not limited in this embodiment of the invention.
In the process of playing the first display file, each image contained in the first display file corresponds to a corresponding display time, and the sub-image corresponding to the display picture of the display unit is a part of the image, so that each sub-image also has a corresponding display time and is recorded in the display information.
For example, suppose the total duration of the first display file is 02:00:00 and the first sub-image is a part of a first image included in the first display file. If the display time of the first image in the first display file is 01:15:30, then, because the first sub-image is a part of the first image, the display time of the first sub-image recorded in the display information is the same as that of the first image, namely 01:15:30.
Furthermore, by combining the position parameters, recorded in the display information, of each sub-image within the images included in the first display file, the content viewed by the user in the first display file (i.e. the N sub-images) can be determined and even marked accordingly, so that the viewing content of different users can be distinguished. In this case the generated second display file is the first display file together with the N position parameters corresponding to the N sub-images.
Case two: the parameter corresponding to each of the N sub-images comprises an angle parameter, namely the relative angle between the electronic device and the reference datum determined by detecting the user's head movement. In this case, the N display moments corresponding to the N images may be determined according to the first display file, and the first display file and the N relative angles corresponding to the N images are packaged in the order of the N display moments to generate the second display file. That is, the second display file generated in this case is the first display file marked with the N display moments and relative angles corresponding to the N sub-images.
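A minimal sketch of this packaging step, under the assumption (made only for illustration) that the second display file is stored as the untouched source video plus a JSON track of (display moment, relative angle) records, could look as follows; `DisplayInfo` is the record type from the previous sketch:

```python
# Sketch of case two: package the first display file with the ordered angle records.
# Storing the track as a JSON sidecar is an assumption for illustration only.
import json
from typing import List

def package_second_display_file(first_file_path: str,
                                records: List["DisplayInfo"],
                                out_path: str) -> None:
    payload = {
        "source": first_file_path,  # the first display file itself is left unchanged
        "view_track": [
            {"time_s": r.display_time_s, "angle_deg": r.relative_angle_deg}
            for r in sorted(records, key=lambda r: r.display_time_s)  # order of display moments
        ],
    }
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(payload, f, indent=2)
```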
It should be noted that the second display file generated in the above case is the first display file with the corresponding identifier (i.e., the recorded display information), and when the second display file is played, a display can be provided for the user based on the corresponding identifier, so that the user can view the N sub-images identified in the display file.
Wherein, the playing the second display file may include: and receiving a playing operation aiming at the second display file, reading a parameter corresponding to each sub-image in the N sub-images according to the playing operation, searching the N sub-images from the first display file according to the read parameters, and displaying the N sub-images.
In practical applications, playing the second display file amounts to reading the corresponding sub-image from the first display file, according to the parameter and display time of each of the N sub-images, and displaying it. For example, if the angle parameter in the display information of the first sub-image is 40 degrees and its display time is 01:11:23, the first image at that display time in the source display file can be determined, and the portion of the first image corresponding to the first sub-image can then be read according to the angle parameter, thereby completing the display of the first sub-image.
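Continuing the same illustrative assumptions (the JSON track from the previous sketch, `extract_sub_image`, a hypothetical `show_sub_image` display call, and a `load_frame_at` helper that seeks the source video to a given display moment), playback of the second display file could be sketched as:

```python
# Playback sketch for the second display file; load_frame_at and show_sub_image are
# hypothetical helpers, not APIs defined by the patent.
import json
from typing import Callable

def play_second_display_file(second_file_path: str,
                             load_frame_at: Callable[[str, float], object],
                             show_sub_image: Callable[[object], None],
                             alpha_deg: float = 40.0) -> None:
    with open(second_file_path, "r", encoding="utf-8") as f:
        payload = json.load(f)
    for record in payload["view_track"]:
        frame = load_frame_at(payload["source"], record["time_s"])   # seek the first display file
        sub_image = extract_sub_image(frame, record["angle_deg"], alpha_deg)
        show_sub_image(sub_image)   # the display angle changes automatically per record
```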
On the other hand, in the embodiment of the present invention, after controlling the display unit to sequentially display the N sub-images, the process of generating the second display file may be: and obtaining N sub-images in the first display file, and combining the N sub-images to generate a second display file.
The following two methods may be adopted to obtain the N sub-images in the first display file.
Method 1: intercepting an image corresponding to each of the N display screens of the display unit as the N sub-images.
That is, when the display unit sequentially displays the N sub-images, the display frames corresponding to the display unit may be captured one by one, and the display image corresponding to the display frame is the corresponding sub-image, so as to obtain the N sub-images.
Method 2: determining the position information of each of the N sub-images within its corresponding image, and extracting the N sub-images from the first display file according to that position information; the position information represents the position of the sub-image in the corresponding image and is determined according to the relative angle corresponding to the sub-image.
Here, the position information may be determined from the detected relative angle and the preset angle (i.e. α degrees) used when displaying the first display file. Since the preset angle may be fixed, once the relative angle corresponding to a sub-image is determined, the position of that sub-image within its image is also determined, that is, the sub-image is located in the corresponding image; when the N sub-images are extracted, each sub-image can therefore be obtained from its relative angle and the preset angle.
For example, if the preset angle (i.e. α degrees) corresponding to the first display file is 40 degrees and, at some moment while the user moves the head, the relative angle (i.e. β degrees) between the electronic device and the reference datum is detected to be 30 degrees, the position in the corresponding image of the sub-image shown on the display unit at that moment can be determined from the preset angle and the relative angle, and the corresponding sub-image can be extracted.
After the N sub-images are extracted by any of the above methods, the N sub-images may be sorted according to the order of the time of acquisition. Since the N sub-images are obtained by displaying in real time according to the head movement of the user, the order of the obtaining times corresponding to the N sub-images can be regarded as the corresponding display order.
For example, suppose that while the user moves the head the relative angle formed between the electronic device and the reference datum is 60°, and three sub-images are obtained successively at this angle, with acquisition times 01:12:06, 01:12:07 and 01:12:08 respectively; the order of the three sub-images in the second display file can then be determined from these acquisition times.
In practical application, each sub-image in the N sub-images may also correspond to a display time in the first display file, and therefore, a display order corresponding to the N sub-images in the second display file may also be determined according to the N display times of the N sub-images.
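The ordering step is simple enough to show in a few lines; here each captured sub-image is assumed (for illustration only) to be stored together with its acquisition-time string, e.g. "01:12:06", so that lexicographic order equals chronological order:

```python
# Order captured sub-images by acquisition time (fixed-width HH:MM:SS strings sort chronologically).
from typing import List, Tuple

def order_sub_images(captured: List[Tuple[str, object]]) -> List[object]:
    """captured: list of (acquisition_time, sub_image) pairs, e.g. ("01:12:06", image)."""
    return [image for _, image in sorted(captured, key=lambda pair: pair[0])]
```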
Because the head motions of different users watching the first display file may be different, different users wearing the electronic device can generate different second display files correspondingly, so that the file content watched by each user can be reproduced, the user can share the display content watched by the user or watch the display content again, and the user experience is high.
Two usage scenarios of the information processing method in the embodiment of the present invention are described below.
Scene one: the recording function is selected while a 360-degree video file is played, so that the display information of each sub-image is recorded during playback. The display information may include the display time and the angle parameter (or position information) corresponding to each sub-image, and a personalized playing file (namely the second display file) is generated from this display information. The images included in the generated second display file may be the same as those of the first display file, but the second display file additionally records (and even marks) the display information corresponding to the sub-images shown on the display unit.
If the user or another user chooses to play the second display file, the sub-images corresponding to the display information can be extracted from the source video file and played according to that display information; while the N sub-images are played, the electronic device automatically changes the display angle for each sub-image, so that the same original video can be played back in a personalized way according to different playing files.
Scene two: the recording function is selected while an electronic device supporting 360-degree playing technology plays a video file (namely the first display file). This function intercepts the currently viewed picture data (namely the sub-images) while the user plays the file, and the intercepted sub-images are combined to generate an ordinary video file (namely the second display file). The user can then share the generated ordinary video file, and when the user or another user plays this second display file, the sub-images it contains are displayed sequentially in the order of their interception times.
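As a sketch of scene two (illustrative only; the patent does not name an encoder, so OpenCV's `VideoWriter` is used here purely as an example, and the output path and frame rate are arbitrary), the intercepted sub-images could be combined into an ordinary, shareable video file like this:

```python
# Combine intercepted sub-images into an ordinary video file (OpenCV used only as an example).
import cv2          # pip install opencv-python
import numpy as np
from typing import List

def write_ordinary_video(sub_images: List[np.ndarray],
                         out_path: str = "viewed_content.mp4",
                         fps: float = 30.0) -> None:
    if not sub_images:
        return
    height, width = sub_images[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    for image in sub_images:
        writer.write(image)          # frames are written in interception order
    writer.release()
```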
As shown in fig. 5, based on the same inventive concept, the embodiment of the present invention also discloses an electronic device, which includes a display 10 and a processor 20.
In particular, the display 10 may be used to display files in an electronic device, such as video.
The processor 20 may be configured to display a first display file in the electronic device through the display according to the received play operation performed on the electronic device, and control the display 10 to sequentially display the N sub-images and generate a second display file corresponding to the N sub-images in the process that the user wears the electronic device; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference datum according to the head movement of the user, and determine a sub-image corresponding to the display 10 in an image included in the first display file according to the relative angle, the sub-image is a part of the corresponding image, the N sub-images are images which are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display 10, and N is a positive integer.
Optionally, the processor 20 may be further configured to: recording N display information related to the N sub-images while controlling the display 10 to sequentially display the N sub-images, wherein each display information of the N display information comprises a parameter of a corresponding sub-image and a display time when the sub-image is displayed by the display 10, and the parameter is used for determining the N sub-images from a first display file; and generating a second display file according to the N pieces of display information and the first display file.
Optionally, if the parameter corresponding to each sub-image in the N sub-images includes a position parameter of the sub-image in the corresponding image, where the position parameter is used to indicate a position of the sub-image in the corresponding image, the processor 20 may be configured to:
and generating a second display file according to the first display file and the N corresponding position parameters in the N sub-images.
Optionally, if the parameter corresponding to each sub-image in the N sub-images includes an angle parameter corresponding to the sub-image, where the angle parameter is a relative angle between the electronic device and the reference datum determined by detecting the head movement of the user, the processor 20 may be configured to:
determining N display moments corresponding to the N images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N images according to the sequence of the N display moments to generate a second display file.
Optionally, the processor 20 may be configured to: and obtaining N sub-images, and generating a second display file according to the N sub-images.
Optionally, the processor 20 is configured to: an image corresponding to each of the N display screens of the display 10 is captured as N sub-images.
Optionally, the processor 20 may be configured to:
determining the position information of each sub-image in the N sub-images in the corresponding image, and extracting the N sub-images from the first display file according to the position information; the position information is information which is used for representing the position of the sub-image in the corresponding image and is determined according to the corresponding relative angle of the sub-image.
Optionally, the processor 20 may be further configured to, after generating a second display file corresponding to the N sub-images, read a parameter corresponding to each sub-image in the N sub-images according to a received play operation for the second display file, and search and display the N sub-images from the first display file according to the read parameter.
As shown in fig. 6, based on the same inventive concept, the embodiment of the present invention further discloses an electronic device, which includes an operation module 301, a control module 302, and a generation module 303.
The operation module 301 may be configured to display a first display file in the electronic device through a display unit of the electronic device according to the received play operation performed on the electronic device; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, wherein the sub-image is a part of the corresponding image;
the control module 302 may be configured to control the display unit to sequentially display the N sub-images while the user wears the electronic device; the N sub-images are images determined in the first display file by the electronic device according to the head movement of the user and displayed through the display unit, and N is a positive integer;
the generating module 303 may be configured to generate a second display file corresponding to the N sub-images.
Optionally, the electronic device may further include a recording module, configured to record N pieces of display information related to the N sub-images while controlling the display unit to sequentially display the N sub-images, where each piece of display information in the N pieces of display information includes a parameter of a corresponding sub-image and a display time when the sub-image is displayed by the display unit, and the parameter is used to determine the N sub-images from the first display file.
At this time, the generating module 303 is configured to: and generating a second display file according to the N pieces of display information and the first display file.
On one hand, if the parameter corresponding to each sub-image in the N sub-images includes a position parameter of the sub-image in the corresponding image, where the position parameter is used to indicate a position of the sub-image in the corresponding image, the generating module 303 is configured to:
and generating a second display file according to the first display file and the N corresponding position parameters in the N sub-images.
On the other hand, if the parameter corresponding to each sub-image in the N sub-images includes an angle parameter corresponding to the sub-image, where the angle parameter is a relative angle between the electronic device and the reference datum determined by detecting the head movement of the user, the generating module 303 is configured to:
determining N display moments corresponding to the N images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N images according to the sequence of the N display moments to generate a second display file.
Optionally, the generating module 303 may be configured to: and obtaining N sub-images, and generating a second display file according to the N sub-images.
Optionally, when the N sub-images are acquired, the generating module 303 may be configured to intercept an image corresponding to each display screen in the N display screens of the display unit as the N sub-images, or determine position information of each sub-image in the corresponding image in the N sub-images, and extract the N sub-images from the first display file according to the position information; the position information is information which is used for representing the position of the sub-image in the corresponding image and is determined according to the corresponding relative angle of the sub-image.
Optionally, after generating the second display file corresponding to the N sub-images, the operation module 301 may be configured to: and receiving a playing operation aiming at the second display file, reading a parameter corresponding to each sub-image in the N sub-images according to the playing operation, and searching and displaying the N sub-images from the first display file according to the read parameters.
Specifically, computer program instructions corresponding to the information processing method in the embodiment of the present application may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive. When the computer program instructions corresponding to the information processing method on the storage medium are read and executed by an electronic device, they cause the following steps to be performed:
displaying a first display file in the electronic equipment through a display unit of the electronic equipment according to the received playing operation aiming at the electronic equipment; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, wherein the sub-image is a part of a corresponding image;
controlling the display unit to sequentially display N sub-images in the process that the user wears the electronic equipment; the N sub-images are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display unit, and N is a positive integer;
and generating a second display file corresponding to the N sub-images.
Optionally, the storage medium further stores other computer instructions, which are executed while the instructions corresponding to controlling the display unit to sequentially display the N sub-images are executed, and which include the following step:
and recording N pieces of display information related to the N sub-images, wherein each piece of display information in the N pieces of display information comprises a parameter of a corresponding sub-image and a display time when the sub-image is displayed through the display unit, and the parameter is used for determining the N sub-images from the first display file.
Optionally, the computer instructions stored on the storage medium that correspond to the step of generating the second display file corresponding to the N sub-images include, when executed, the following step:
generating the second display file according to the N pieces of display information and the first display file.
Optionally, if the parameter corresponding to each of the N sub-images includes a position parameter of the sub-image in the corresponding image, the position parameter indicating the position of the sub-image in the corresponding image, the computer instructions stored on the storage medium that correspond to the step of generating the second display file according to the N pieces of display information and the first display file include, when executed, the following step:
and generating the second display file according to the first display file and the N corresponding position parameters in the N sub-images.
Optionally, if the parameter corresponding to each of the N sub-images includes an angle parameter corresponding to the sub-image, the angle parameter being the relative angle between the electronic device and the reference datum determined by detecting the head movement of the user, the computer instructions stored on the storage medium that correspond to the step of generating the second display file according to the N pieces of display information and the first display file include, when executed, the following steps:
determining N display moments corresponding to the N images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N images according to the sequence of the N display moments to generate the second display file.
Optionally, the computer instructions stored on the storage medium that correspond to the step of generating the second display file corresponding to the N sub-images include, when executed, the following steps:
obtaining the N sub-images;
and generating the second display file according to the N sub-images.
Optionally, the computer instructions stored on the storage medium that correspond to the step of obtaining the N sub-images include, when executed, the following step:
and intercepting an image corresponding to each display screen in the N display screens of the display unit as the N sub-images.
Optionally, the computer instructions stored on the storage medium that correspond to the step of obtaining the N sub-images may instead include, when executed, the following step:
determining the position information of each sub-image in the N sub-images in the corresponding image, and extracting the N sub-images from the first display file according to the position information; and the position information is information which is used for representing the position of the sub-image in the corresponding image and is determined according to the corresponding relative angle of the sub-image.
Optionally, the storage medium further stores other computer instructions, which are executed after the instructions corresponding to generating the second display file corresponding to the N sub-images, and which include the following steps:
receiving a play operation for the second display file;
reading a parameter corresponding to each sub-image in the N sub-images according to the playing operation;
searching the N sub-images from the first display file according to the read parameters;
and displaying the N sub-images.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (15)

1. An information processing method comprising:
displaying a first display file in the electronic equipment through a display unit of the electronic equipment according to the received playing operation aiming at the electronic equipment; the electronic equipment is worn on the head of a user, the electronic equipment can determine the relative angle of the electronic equipment relative to a reference standard according to the head movement of the user, and determine a sub-image corresponding to the display unit in the image included in the first display file according to the relative angle, wherein the sub-image is a part of a corresponding image;
controlling the display unit to sequentially display N sub-images in the process that the user wears the electronic equipment; the N sub-images are determined in the first display file by the electronic equipment according to the head movement of the user and are displayed through the display unit, and N is a positive integer; wherein, while controlling the display unit to sequentially display the N sub-images, the method further comprises:
recording N pieces of display information related to the N sub-images, wherein each piece of display information in the N pieces of display information comprises a parameter of a corresponding sub-image and a display time when the N pieces of display information are displayed through the display unit, the parameter is used for determining the N sub-images from the first display file, the parameter corresponding to each sub-image in the N pieces of sub-images comprises an angle parameter corresponding to the sub-image, and the angle parameter is a relative angle between the electronic equipment and the reference standard determined by detecting the head movement of the user;
determining N display moments corresponding to the N sub-images according to the first display file;
and packaging the first display file and the N relative angles corresponding to the N sub-images according to the sequence of the N display moments to generate a second display file corresponding to the N sub-images.
2. The method of claim 1, wherein the generating a second display file corresponding to the N sub-images comprises:
and generating the second display file according to the N pieces of display information and the first display file.
3. The method of claim 2, wherein the parameter corresponding to each of the N sub-images comprises a position parameter of the sub-image in the corresponding image, the position parameter indicating a position of the sub-image in the corresponding image,
generating the second display file according to the N display information and the first display file comprises:
and generating the second display file according to the first display file and the N corresponding position parameters in the N sub-images.
4. The method of claim 1, wherein the generating a second display file corresponding to the N sub-images comprises:
obtaining the N sub-images;
and generating the second display file according to the N sub-images.
5. The method of claim 4, wherein the obtaining the N sub-images comprises:
and intercepting an image corresponding to each display screen in the N display screens of the display unit as the N sub-images.
6. The method of claim 4, wherein obtaining the N sub-images comprises:
determining position information of each of the N sub-images in the corresponding image, and extracting the N sub-images from the first display file according to the position information; wherein the position information represents the position of the sub-image in the corresponding image and is determined according to the relative angle corresponding to the sub-image.
7. The method of claim 3, wherein after the second display file corresponding to the N sub-images is generated, the method further comprises:
receiving a playing operation for the second display file;
reading the parameter corresponding to each of the N sub-images according to the playing operation;
searching for the N sub-images in the first display file according to the read parameters; and
displaying the N sub-images.
8. An electronic device, comprising:
a display unit configured to display a display file in the electronic device; and
a processor configured to display a first display file in the electronic device through the display unit according to a received playing operation performed on the electronic device, to control the display unit to sequentially display N sub-images while a user wears the electronic device, and to generate a second display file corresponding to the N sub-images;
wherein the electronic device is worn on the head of the user, the electronic device is capable of determining a relative angle of the electronic device with respect to a reference standard according to the head movement of the user and of determining, according to the relative angle, a sub-image corresponding to the display unit in an image included in the first display file, the sub-image is a part of the corresponding image, the N sub-images are images that the electronic device determines in the first display file according to the head movement of the user and displays through the display unit, and N is a positive integer;
wherein the processor is further configured to:
the method comprises the steps that N pieces of display information related to N sub-images are recorded while the display unit is controlled to sequentially display the N sub-images, each piece of display information in the N pieces of display information comprises a parameter of a corresponding sub-image and a display moment when the sub-images are displayed through the display unit, the parameter is used for determining the N sub-images from a first display file, the parameter corresponding to each sub-image in the N sub-images comprises an angle parameter corresponding to the sub-image, the angle parameter is a relative angle between the electronic equipment determined by detecting head movement of a user and a reference datum, and the N pieces of display moments corresponding to the N sub-images are determined according to the first display file; and packaging the first display file and the N relative angles corresponding to the N sub-images according to the sequence of the N display moments to generate a second display file corresponding to the N sub-images.
9. The electronic device of claim 8, wherein the processor is configured to:
generate the second display file according to the N pieces of display information and the first display file.
10. The electronic device of claim 9, wherein the parameter corresponding to each of the N sub-images comprises a position parameter of the sub-image in the corresponding image, the position parameter indicating a position of the sub-image in the corresponding image, and the processor is configured to:
generate the second display file according to the first display file and the N position parameters corresponding to the N sub-images.
11. The electronic device of claim 8, wherein the processor is configured to:
obtain the N sub-images; and
generate the second display file according to the N sub-images.
12. The electronic device of claim 11, wherein the processor is configured to:
capture, as the N sub-images, an image corresponding to each of the N pictures displayed by the display unit.
13. The electronic device of claim 11, wherein the processor is configured to:
determine position information of each of the N sub-images in the corresponding image, and extract the N sub-images from the first display file according to the position information; wherein the position information represents the position of the sub-image in the corresponding image and is determined according to the relative angle corresponding to the sub-image.
14. The electronic device of claim 10, wherein the processor is further configured to: after the second display file corresponding to the N sub-images is generated, read the parameter corresponding to each of the N sub-images according to a received playing operation for the second display file, search for the N sub-images in the first display file according to the read parameters, and display the N sub-images.
15. An electronic device, comprising:
an operation module configured to display a first display file in the electronic device through a display unit of the electronic device according to a received playing operation performed on the electronic device; wherein the electronic device is worn on the head of a user, and the electronic device is capable of determining a relative angle of the electronic device with respect to a reference standard according to the head movement of the user and of determining, according to the relative angle, a sub-image corresponding to the display unit in an image included in the first display file, the sub-image being a part of the corresponding image;
a control module configured to control the display unit to sequentially display N sub-images while the user wears the electronic device, wherein the N sub-images are images that the electronic device determines in the first display file according to the head movement of the user and displays through the display unit, and N is a positive integer; the control module being further configured to record, while controlling the display unit to sequentially display the N sub-images, N pieces of display information related to the N sub-images, wherein each piece of display information comprises a parameter of the corresponding sub-image and a display moment at which the corresponding sub-image is displayed through the display unit, the parameter is used for determining the N sub-images from the first display file, the parameter corresponding to each of the N sub-images comprises an angle parameter of the sub-image, and the angle parameter is the relative angle between the electronic device and the reference standard determined by detecting the head movement of the user; and
a generating module configured to determine N display moments corresponding to the N sub-images according to the first display file, and to package the first display file and the N relative angles corresponding to the N sub-images according to the sequence of the N display moments, so as to generate a second display file corresponding to the N sub-images.
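
The following Python sketch is offered purely as an illustration of the flow recited above, roughly the record-and-package steps of claims 1 to 3 and the playback steps of claim 7. It is a minimal sketch under assumed conventions: the names DisplayInfo, SecondDisplayFile, ViewRecorder, play_second_file, crop_sub_image and show, as well as the (yaw, pitch) representation of the relative angle, are illustrative assumptions and are not prescribed by the claims, which specify no particular data structure or file format for the second display file.

import time
from dataclasses import dataclass, field
from typing import Callable, List, Tuple


@dataclass
class DisplayInfo:
    """One recorded piece of display information (hypothetical structure):
    the relative angle of the head-mounted device with respect to the
    reference standard, and the display moment of the sub-image."""
    relative_angle: Tuple[float, float]  # (yaw, pitch) in degrees -- assumed representation
    display_moment: float                # seconds since playback of the first display file began


@dataclass
class SecondDisplayFile:
    """Packaged result: a reference to the first display file plus the
    angle track ordered by display moment."""
    first_display_file: str
    display_track: List[DisplayInfo] = field(default_factory=list)


class ViewRecorder:
    """Records display information while sub-images are shown and packages
    it with the first display file (the record-and-package steps of claim 1)."""

    def __init__(self, first_display_file: str) -> None:
        self.first_display_file = first_display_file
        self._infos: List[DisplayInfo] = []
        self._start = time.monotonic()

    def on_sub_image_displayed(self, yaw: float, pitch: float) -> None:
        # Called whenever the display unit shows the sub-image selected by
        # the user's head movement; stores the angle and the display moment.
        moment = time.monotonic() - self._start
        self._infos.append(DisplayInfo((yaw, pitch), moment))

    def package(self) -> SecondDisplayFile:
        # Package the first display file together with the N relative angles,
        # in the order of the N display moments.
        ordered = sorted(self._infos, key=lambda info: info.display_moment)
        return SecondDisplayFile(self.first_display_file, ordered)


def play_second_file(
    second: SecondDisplayFile,
    crop_sub_image: Callable[[str, Tuple[float, float]], object],
    show: Callable[[object], None],
) -> None:
    """Playback flow sketched after claim 7: read each recorded parameter,
    locate the corresponding sub-image in the first display file, display it.
    crop_sub_image and show stand in for the device's own decoding and
    rendering routines, which the claims do not specify."""
    for info in second.display_track:
        show(crop_sub_image(second.first_display_file, info.relative_angle))

Packaging a reference to the first display file together with the ordered angle track, rather than the cropped sub-images themselves, keeps the second display file small; storing the captured sub-images directly, as claims 4 to 6 describe, trades storage for independence from the first display file.
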
CN201510996998.XA 2015-12-25 2015-12-25 Information processing method and electronic equipment Active CN105630170B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510996998.XA CN105630170B (en) 2015-12-25 2015-12-25 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105630170A CN105630170A (en) 2016-06-01
CN105630170B true CN105630170B (en) 2019-12-24

Family

ID=56045198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510996998.XA Active CN105630170B (en) 2015-12-25 2015-12-25 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105630170B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105872390A (en) * 2016-06-15 2016-08-17 南京快脚兽软件科技有限公司 Equipment for efficiently checking peripheral environment images

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6011526A (en) * 1996-04-15 2000-01-04 Sony Corporation Display apparatus operable in synchronism with a movement of the body of a viewer
CN102014236A (en) * 2009-09-08 2011-04-13 鸿富锦精密工业(深圳)有限公司 Interactive image playing system and method
CN102063249A (en) * 2009-11-16 2011-05-18 美国博通公司 Communication method and system
CN103856716A (en) * 2012-12-06 2014-06-11 三星电子株式会社 Display apparatus for displaying images and method thereof
CN104270623A (en) * 2014-09-28 2015-01-07 联想(北京)有限公司 Display method and electronic device
CN104703017A (en) * 2015-02-09 2015-06-10 联想(北京)有限公司 Display control method and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020060691A1 (en) * 1999-11-16 2002-05-23 Pixel Kinetix, Inc. Method for increasing multimedia data accessibility
KR101806891B1 (en) * 2011-04-12 2017-12-08 엘지전자 주식회사 Mobile terminal and control method for mobile terminal
KR101315303B1 (en) * 2011-07-11 2013-10-14 한국과학기술연구원 Head mounted display apparatus and contents display method

Also Published As

Publication number Publication date
CN105630170A (en) 2016-06-01

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant