CN113168823A - Display control method, electronic device, and computer-readable storage medium - Google Patents


Info

Publication number
CN113168823A
Authority
CN
China
Prior art keywords
display screen
area
range
image
angle
Prior art date
Legal status
Pending
Application number
CN201880097634.XA
Other languages
Chinese (zh)
Inventor
李友 (Li You)
Current Assignee
Shenzhen Royole Technologies Co Ltd
Original Assignee
Shenzhen Royole Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Royole Technologies Co Ltd filed Critical Shenzhen Royole Technologies Co Ltd
Publication of CN113168823A publication Critical patent/CN113168823A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/14: Display of multiple viewports

Abstract

A display control method, an electronic device (50), and a computer-readable storage medium are disclosed. The display control method includes: controlling an image sensor to acquire an image within a preset area range relative to a body (100); judging whether the image contains human body features (102); if so, acquiring position information of the user corresponding to the human body features within the preset area range (104); and determining the visual range of the user relative to the display screen according to the position information, and controlling a picture to be displayed on the area of the display screen corresponding to the visual range (106). By acquiring the position information of the user within the preset area range and displaying the picture on the corresponding display area according to that information, the position of the content displayed on the screen can be adjusted when the user's position changes, which improves the user experience.

Description

Display control method, electronic device, and computer-readable storage medium

Technical Field
The present application relates to the field of intelligent devices, and in particular, to a display control method, an electronic device, and a computer-readable storage medium.
Background
With the development of science and technology, electronic devices take increasingly varied structural forms, bringing users a richer experience. For example, a display screen with a certain curvature may be attached to the outer side of a cylindrical electronic device, so that a picture is displayed on the outer surface of the device for a user to view.
However, because the position of the picture displayed on the cylindrical electronic device is fixed, a user standing at certain positions around the device may be unable to see the displayed picture; that is, the picture cannot be viewed from every position, which degrades the user experience.
Disclosure of Invention
The present application is directed to a display control method, an electronic device, and a computer-readable storage medium, so that a user can conveniently view the picture displayed on a display screen.
A first aspect of the embodiments of the present application provides a display control method applied to an electronic device, where the electronic device has a cylindrical body and a display screen disposed around the body. The method includes:
controlling an image sensor to acquire an image in a preset area range relative to the body;
judging whether the image contains human body features;
when the image contains the human body characteristics, acquiring position information of a user corresponding to the human body characteristics within the preset area range;
and determining the visual range of the user relative to the display screen according to the position information, and controlling the display screen to display the picture on the area of the display screen corresponding to the visual range.
A second aspect of embodiments of the present application provides an electronic device, including a cylindrical body, the electronic device including:
the image sensor is used for acquiring an image in a preset area range relative to the body;
the display screen is arranged around the body;
the processor is connected with the image sensor and the display screen and is used for judging whether the image contains human body characteristics; when the image contains the human body characteristics, the processor is used for acquiring the position information of the user corresponding to the human body characteristics within the preset area range; the processor is further used for determining a visual range of a user relative to the display screen according to the position information and controlling a picture to be displayed on an area of the display screen corresponding to the visual range.
A third aspect of embodiments of the present application provides a computer-readable storage medium, which stores computer instructions, where the computer instructions, when executed by a processor, implement some or all of the steps as described in any of the methods of the first aspect of embodiments of the present application.
Compared with the prior art, the embodiment of the application provides a display control method, an electronic device and a computer-readable storage medium, and a picture with a preset display range is displayed according to position information by acquiring the position information of a user relative to a preset area range, so that when the position of the user changes, the position of content displayed on a screen can be adjusted, the displayed content can fall into a visual range watched by the user, and the user experience can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a flowchart illustrating steps of a display control method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of an electronic device in an embodiment of the present application.
Fig. 3 is a schematic diagram of the position of an image sensor in an embodiment of the present application.
Fig. 4 is a schematic diagram of a shooting angle of view of an image sensor in an embodiment of the present application.
Fig. 5 is a schematic diagram of an image captured by an image sensor in an embodiment of the present application.
Fig. 6 is a flowchart illustrating steps of a display control method according to another embodiment of the present application.
Fig. 7 is a block diagram of a hardware structure of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The following describes embodiments of the present application in detail.
Referring to fig. 1, a flowchart illustrating steps of a display control method according to an embodiment of the present application is shown. The display control method runs on an electronic device. Fig. 2 is a schematic diagram of an electronic device according to an embodiment of the present application. The electronic device 50 has a cylindrical body 580, a display screen 582, and an image sensor 584, where the display screen 582 surrounds the body 580 and covers all or a portion of it. The image sensor 584 is disposed on the body 580. In this embodiment, the display screen 582 may be a flexible display screen fitted around the body 580. In other embodiments, the electronic device 50 may include a plurality of display screens, each disposed at a corresponding position on the body 580. The display control method includes the following steps:
and step 100, controlling the image sensor to acquire an image in a preset area range relative to the body.
In this embodiment, the electronic device 50 may be a smart speaker. The cross-section of the body 580 is a circle with a radius r, and the radius r is the distance from the center o to the outer side wall of the body 580. In this embodiment, the display 582 may be disposed along the circumference of the body 580, and the display 582 is disposed on the entire circumference of the body 580, i.e., the length of the display 582 is equal to the circumference 2 π r of the body 580. In other embodiments, the display 582 is disposed on a portion of the circumference of the body 580, i.e., the length of the display 582 is less than the circumference 2 π r of the body 580.
The body 580 is provided with a plurality of image sensors 584, and each image sensor 584 is configured to acquire an image within a preset area relative to the body 580.
The image sensor 584 may be a camera with a certain shooting angle of view; in this embodiment, the shooting angle of view refers to the horizontal shooting angle of view. In this embodiment, the shooting angle of view of the image sensor 584 may be 120°, in which case the image sensor 584 may be a fisheye camera. Since the peripheral scene of the body 580 spans 360°, in order to photograph the entire peripheral scene, it may be divided into a plurality of preset area ranges, for example three preset area ranges of 120° each, with each preset area range corresponding to the area between two viewing angle sidelines.
Fig. 3 is a schematic diagram illustrating the position of an image sensor according to an embodiment of the present application. The peripheral scene of the body 580 includes a first preset area range, a second preset area range, and a third preset area range. In this embodiment, the angle range value of each preset area range is set in the counterclockwise direction with one viewing angle sideline oA taken as the 0° starting edge. For example, the first preset area range includes the area between the viewing angle sideline oA and the viewing angle sideline oB, with the corresponding angle range value [0°, 120°); the second preset area range includes the area between oB and oC, with the angle range value [120°, 240°); and the third preset area range includes the area between oC and oA, with the angle range value [240°, 360°). The sideline oA intersects the display screen 582 attached to the outer side wall of the body 580 at a point H, the sideline oB intersects it at a point E, and the sideline oC intersects it at a point F. In this embodiment, one image sensor 584 is disposed in each preset area range; that is, three image sensors 584 may be disposed in the electronic device 50 to obtain images of the entire scene around the periphery of the body 580. In other embodiments, the angle range value of each preset area range may be set in other ways.
For example, when the starting point of the viewing angle sideline oA is set to 20° and the angle range value of each preset area range is determined in the counterclockwise direction, the first preset area range includes the area between the sidelines oA and oB, with the corresponding angle range value [20°, 140°); the second preset area range includes the area between oB and oC, with the angle range value [140°, 260°); and the third preset area range includes the area between oC and oA, with the angle range value [260°, 20°). In other embodiments, the peripheral scene of the body 580 may instead be divided into four preset area ranges of 90° each, in which case the image sensor 584 may be a wide-angle camera. In other words, the type of camera used as the image sensor 584 may be chosen according to the size of the angle range value of each preset area range.
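The division of the peripheral scene into preset area ranges described above can be sketched as follows. This is a minimal illustration, not part of the patent; the helper name `partition_periphery` and the half-open-interval representation are assumptions of this sketch.

```python
def partition_periphery(fov_deg, start_deg=0):
    """Split the 360-degree peripheral scene of the cylindrical body into
    preset area ranges, one per image sensor, each as wide as the sensor's
    horizontal shooting angle of view (half-open ranges, in degrees)."""
    if 360 % fov_deg != 0:
        raise ValueError("shooting angle of view must divide 360 evenly")
    n = 360 // fov_deg  # number of image sensors / preset area ranges
    ranges = [((start_deg + i * fov_deg) % 360,
               (start_deg + (i + 1) * fov_deg) % 360) for i in range(n)]
    return n, ranges

# Three 120-degree fisheye cameras, with oA as the 0-degree starting edge:
print(partition_periphery(120))      # (3, [(0, 120), (120, 240), (240, 0)])
# The alternative 20-degree starting edge from the text:
print(partition_periphery(120, 20))  # (3, [(20, 140), (140, 260), (260, 20)])
```

With a 90° wide-angle camera, `partition_periphery(90)` would likewise yield the four preset area ranges mentioned above.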
In one embodiment, the three image sensors 584 may be disposed at the center o. In another embodiment, they may be mounted above the center o by a suitable structure, for example along a straight line that passes through the center o and runs in the height direction X of the body 580; or at positions above the circle containing o such that each image sensor is equidistant from o; or on a circle concentric with o whose radius is less than or equal to r.
In this embodiment, one image sensor is disposed in each preset area range, and the shooting direction of the image sensor coincides with the center line of the preset area range in which it is located. Since each preset area range is 120° and the shooting angle of view of each image sensor is also 120°, the shooting angle of view of each image sensor can coincide with its preset area range. For example, the three image sensors 584 may include a first, a second, and a third image sensor. The first image sensor may be disposed at the center of the circle and acquires an image of the first preset area range AoB; the middle of the first preset area range is at 60°, so the shooting direction of the first image sensor is also 60°. The second image sensor may be disposed at the center of the circle and acquires an image of the second preset area range BoC; the middle of that range is at 180°, so its shooting direction is 180°. The third image sensor may be disposed at the center of the circle and acquires an image of the third preset area range CoA; the middle of that range is at 300°, so its shooting direction is 300°.
In one embodiment, the image sensors 584 may be disposed along the circumference of the body 580, for example one image sensor 584 within each preset area range. For instance, an image sensor 584 may be disposed at the intersection D of the center line oR of the first preset area range AoB and the display screen 582. Because the radius r of the body 580 is small relative to the distance the image sensor can photograph (i.e., the distance between the user and the electronic device), an image sensor at the point D and an image sensor at the center o obtain substantially the same image of the first preset area range; the two placements can be treated as equivalent, and the effect on subsequent calculations can be ignored. For the same reason, the image sensor may also be disposed above the body 580.
Fig. 4 is a schematic diagram of the shooting angle of view of an image sensor according to an embodiment of the present application. When the image sensor 584 is disposed on the outer side wall of the body 580, a blind area may exist during shooting if its shooting angle of view merely equals the preset area range in which it is located. Therefore, when the image sensor 584 is disposed on the outer side wall of the body 580, its shooting angle of view can be made larger than the preset area range. For example, the shooting angle of view of the image sensor is set to 160° while the first preset area range is 120°; the first preset area range lies between the viewing angle sidelines oA and oB, and the shooting angle of view of the image sensor 584 corresponds to the angle of the area between the sidelines DU and DV.
Step 102: judging whether the image contains human body features; if yes, executing step 104; otherwise, returning to step 100.
Fig. 5 is a schematic diagram of an image captured by an image sensor according to an embodiment of the present application; the image shown was captured by the image sensor located in the first preset area range and contains the human body feature S'. Human body feature detection is performed on the image: when human body features are identified in the image, the image is determined to contain them; when no human body features are identified, the image is determined not to contain them. In this embodiment, the human body features may include a human face; that is, whether the image contains facial features may also be used to decide whether to execute step 104 or return to step 100.
Step 104: acquiring the position information of the user corresponding to the human body features within the preset area range.
In this embodiment, the position information may be an area position angle of the user within the preset area range, where the area position angle is the included angle, at the center o, between the direction of the point where the user is located (the user's viewing position) and a viewing angle sideline of the preset area range.
Referring to fig. 3, assume that the user is located at a point S in the first preset area range AoB, and that the image obtained by the first image sensor contains the human body feature S'. In this embodiment, the included angle between the point S and the viewing angle sideline oB is the area position angle γ, i.e. γ = ∠BoS with γ ∈ [0°, 120°), where the point S is taken on a straight line AB perpendicular to the lens center line oR. The triangle △AoB formed by the viewing angle sidelines oA and oB is an isosceles triangle with an apex angle of 120°, where ∠RoB = 60° and Ao = Bo. Since △AoS and △SoB share the same height from o to the line AB, the ratio of their areas equals the ratio of their bases, and expressing each area with the sine formula gives:

AS/SB = S(△AoS)/S(△SoB) = (1/2 · Ao · oS · sin(120° − γ)) / (1/2 · Bo · oS · sin γ)    (1)

Since Ao = Bo, formula (1) simplifies to:

AS/SB = sin(120° − γ) / sin γ    (2)

Therefore, the area position angle γ can be calculated from the ratio between the line segments AS and SB.
Referring to fig. 4, the human body feature S' in the image corresponds to the user at the point S. Since the shooting angle of view of the image sensor coincides with the preset area range, the viewing angle sideline oA corresponds to the first side A' of the image width and the viewing angle sideline oB corresponds to the second side B'; and since S lies on the line AB perpendicular to the lens center line, the division of AB maps proportionally onto the image width, so that:

AS/SB = A'S' / S'B'    (3)
as can be seen from the formulas (2) and (3), the region position angle γ can be calculated according to the division ratio of the human body feature S ' on the image width a ' B ' in the image.
In one embodiment, the area position angle may instead be the included angle between the point S and the viewing angle sideline oA, represented as ∠AoS, where:

∠AoS = ∠AoB − ∠BoS    (4)

Therefore, once the value of the area position angle γ = ∠BoS has been calculated, the other representation of the area position angle can be obtained from formula (4). Since the user's viewing position relative to the body is fixed, both representations of the position information identify the same viewing position on the body.
In one embodiment, a first distance A'S' from the human body feature to the first side A' of the image and a second distance S'B' from the human body feature to the second side B' of the image may be obtained, and the position information γ may then be calculated from the ratio of the first distance A'S' to the second distance S'B'.
In one embodiment, the center point position of the human body feature (e.g., the point where the two dashed lines perpendicularly intersect in fig. 4) can be obtained; the center point position is the center of the geometric region occupied by the human body feature in the image. The distance between the center point and the first side can then be used as the first distance, and the distance between the center point and the second side as the second distance.
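The center-point measurement above can be sketched as follows, assuming the human body feature is reported as a rectangular bounding box in pixels (a common detector output; the box format and helper name are assumptions of this sketch, not specified by the patent).

```python
def split_distances(image_width, bbox):
    """First distance A'S' and second distance S'B', measured from the
    center point of a human-body-feature bounding box bbox = (x, y, w, h)
    to the first and second sides of the image width, in pixels."""
    x, y, w, h = bbox
    cx = x + w / 2.0             # center point of the human body feature
    return cx, image_width - cx  # (A'S', S'B')

# A 640-pixel-wide image with a face box centered at x = 480:
print(split_distances(640, (440, 100, 80, 120)))  # (480.0, 160.0)
```

The two distances returned here are exactly the inputs needed to evaluate the ratio in formulas (2) and (3).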
In this embodiment, when the image sensor 584 is disposed at the center of the circle, it can obtain images of the entire scene within the preset area range; when the image sensor 584 is disposed on the circumference of the body 580, it obtains images of only part of that scene. Because the lens center line of the image sensor coincides with the center line of the preset area range, the position information of the user within the preset area range can still be calculated with the formulas above even when only part of the scene is imaged.
Referring to fig. 4, in one embodiment the image sensor may be a depth camera. When the image sensor is disposed at the point D on the outer side wall of the body 580, it can obtain the distance SD between the points S and D. In addition, a depth camera can be disposed at the center o of the body 580 to obtain the distance So between the point S where the user is located and the center o. Since the distance between the point D and the center o is the radius r, the three sides So, SD, and Do of the triangle SoD are all known, and ∠SoD can be calculated by the law of cosines. Since the image sensor lies on the center line of the first preset area range, ∠BoD = 60°, so the area position angle is γ = ∠BoD + ∠SoD = 60° + ∠SoD (or 60° − ∠SoD when S lies on the other side of the center line). Thus, the value of the area position angle can also be obtained when the shooting angle of view of the image sensor is greater than the preset area range. In other embodiments, the area position angle may be obtained by other methods, which is not limited in this application.
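The law-of-cosines step for the depth-camera variant can be sketched as follows; the function name is hypothetical, and the sketch only computes ∠SoD from the three known sides of the triangle SoD.

```python
import math

def angle_sod(so, sd, r):
    """Angle SoD in degrees from the three known sides of triangle SoD:
    So (user to center o, from the depth camera at o), SD (user to the
    sensor at D), and Do = r (the body radius), via the law of cosines:
    cos(SoD) = (So^2 + Do^2 - SD^2) / (2 * So * Do)."""
    cos_sod = (so * so + r * r - sd * sd) / (2.0 * so * r)
    return math.degrees(math.acos(cos_sod))

# Equilateral case So = SD = r gives angle SoD = 60 degrees:
print(round(angle_sod(1.0, 1.0, 1.0), 1))  # 60.0
```

With ∠SoD in hand, the area position angle follows as 60° ± ∠SoD as described above.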
Step 106: determining the visual range of the user relative to the display screen according to the position information, and controlling the display screen to display the picture on the area of the display screen corresponding to the visual range.
For a display screen attached to the cylindrical body, the maximum horizontal viewing angle α of the user's two eyes is about 188°, of which the overlapping binocular portion is about 124°. In this embodiment, assuming the user's horizontal viewing angle α is 180°, the center position of the area of the display screen 582 corresponding to the visual range can be aligned with the user, where the visual range can be represented as:

(γ − α/2, γ + α/2).

For example, when α = 180°, the visual range can be expressed as (γ − 90°, γ + 90°). In this way, the electronic device can control a corresponding picture to be displayed on the area of the display screen corresponding to the visual range.
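The visual-range computation can be sketched as follows; the wrap-around modulo 360° is an assumption of this sketch (the display surrounds the body, so the arc may cross the 0° starting edge).

```python
def visual_range(view_deg, alpha_deg=180.0):
    """Visible arc (start, end) of the wrap-around display for a user at
    viewing position angle view_deg, assuming a horizontal viewing angle
    alpha_deg; angles wrap modulo 360 because the display surrounds the
    cylindrical body."""
    return ((view_deg - alpha_deg / 2.0) % 360,
            (view_deg + alpha_deg / 2.0) % 360)

# Viewing position angle of 100 degrees (the gamma = 20 degree example):
print(visual_range(100.0))  # (10.0, 190.0)
```

A user near the 0° edge, e.g. `visual_range(350.0)`, yields `(260.0, 80.0)`, an arc crossing the starting edge.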
In one embodiment, since the radius of the body 580 is r and the circumference l of the body 580 is l ═ 2 π r, displaying a frame on the area of the display screen corresponding to the visual range may mean displaying a frame with a preset length on the display screen, where the preset length range is (T, I). The frame with the preset length can be determined by the visual range, where T represents a starting value of the preset length, I represents an ending value of the preset length, and the preset length can be expressed as:
Figure PCTCN2018121834-APPB-000004
In one embodiment, after the area position angle value is determined, the viewing position angle of the user in the peripheral scene of the body 580 can be determined from the angle range value of the preset area range in which the image sensor is located. For example, with the viewing angle sideline oA taken as the 0° starting edge and the angle range values of the preset area ranges set in the counterclockwise direction, the first preset area range has the angle range value [0°, 120°); therefore, if the calculated area position angle γ is 20°, the viewing position angle of the user at the point S in the peripheral scene of the body 580 is 100°. In other embodiments, if the image sensor is located in the second preset area range, whose angle range value is [120°, 240°), and the area position angle between the user and the viewing angle sideline oC is 20°, then the viewing position angle of the user at the point S is 220°. Likewise, when the starting point of the viewing angle sideline oA is 20° and the angle range values are set in the counterclockwise direction, the first preset area range is [20°, 140°); if the area position angle γ is 20°, the viewing position angle of the user at the point S is 120°.
Therefore, when γ = 20°, the viewing position angle of the user at the point S in the peripheral scene of the body 580 is calculated to be 100°, and this viewing position angle corresponds to the visual range (10°, 190°). The preset length corresponding to this viewing position angle can be represented as:

(T, I) = (10°/360° · 2πr, 190°/360° · 2πr) = (πr/18, 19πr/18)
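The conversion from viewing position angle to arc positions on the display can be sketched as follows; this is a minimal illustration of the (T, I) formula above, with a hypothetical helper name.

```python
import math

def preset_length(view_deg, r, alpha_deg=180.0):
    """Arc positions (T, I) of the picture on a display of circumference
    2*pi*r, bounding the visual range of a user at viewing position angle
    view_deg with horizontal viewing angle alpha_deg."""
    circ = 2.0 * math.pi * r
    t = ((view_deg - alpha_deg / 2.0) % 360) / 360.0 * circ
    i = ((view_deg + alpha_deg / 2.0) % 360) / 360.0 * circ
    return t, i

# Viewing position angle 100 degrees, visual range (10, 190) degrees:
t, i = preset_length(100.0, r=1.0)
print(round(t, 4), round(i, 4))  # 0.1745 3.3161  (= pi*r/18 and 19*pi*r/18)
```

For r = 1 this reproduces the worked example: T = π/18 ≈ 0.1745 and I = 19π/18 ≈ 3.3161.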
according to the display control method, the position information of the user relative to the preset area range is obtained, the visual range of the user relative to the display screen is determined according to the position information, and the picture is controlled to be displayed on the area of the display screen corresponding to the visual range.
Referring to fig. 6, a flowchart illustrating steps of a display control method according to another embodiment of the present application is shown. The display control method includes the following steps:
and 200, controlling the image sensor to acquire an image in a preset area range relative to the body.
Step 202: judging whether the image contains human body features; if yes, executing step 206; if not, executing step 204.
Step 204: executing a delay operation and returning to step 200.
When the image does not contain human body features (or in other such cases), a delay operation is performed; for example, after waiting a preset time, the step of controlling the image sensor to acquire an image within the preset area range is performed again. In one embodiment, if the image does not contain human body features, the delay time of the delay operation may be a first preset time.
Step 206: acquiring the position information of the user corresponding to the human body features within the preset area range.
Step 208: determining the visual range of the user relative to the display screen according to the position information.
Step 210: judging whether the angle difference between the area of the display screen corresponding to the visual range to be displayed and the area of the currently displayed picture is greater than a preset angle value; if yes, executing step 212; if not, the area of the currently displayed picture need not be adjusted, and step 204 is executed to perform a delay operation, for example waiting a second preset time before returning to step 200.
In one embodiment, at a first time, the viewing position angle of the user in the peripheral scene of the body 580 is 100°, and the area of the currently displayed picture on the display screen can be determined (per fig. 3) to be (10°, 190°); that is, the first preset angle range is (10°, 190°). At a second time, if the user has moved so that the viewing position angle is 130°, the area corresponding to the visual range that the display screen needs to display is (30°, 210°); that is, the second preset angle range is (30°, 210°). The angle difference between the area to be displayed and the area of the currently displayed picture is therefore 20°. If the preset angle value is 10°, the angle difference is greater than the preset angle value, and the area of the currently displayed picture on the display screen needs to be adjusted. Thus, the first preset angle range of the area of the currently displayed picture can first be obtained, the second preset angle range of the area to be displayed according to the position information can then be obtained, and whether the difference between the two ranges is greater than the preset angle value can be judged.
In an embodiment, a first center position of the area of the currently displayed picture on the display screen may be acquired, a second center position of the area that needs to be displayed according to the position information may be acquired, and the two may then be compared to judge whether the angle difference between them is greater than the preset angle value. In this embodiment, the center position of the currently displayed picture represents the viewing position corresponding to that picture. For example, when the display screen currently displays a picture within the preset angle range (10°, 190°), the first center position is 100°; when the area to be displayed according to the position information is (30°, 210°), the second center position is 130°. The angle difference between the first center position and the second center position is thus 30°, which is greater than the preset angle value of 10°, so the area of the currently displayed picture on the display screen needs to be adjusted.
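The center-based comparison can be sketched similarly (again an illustrative sketch; the center values below are taken directly from the embodiment rather than derived, and the function name is an assumption):

```python
def center_angle_difference(first_center, second_center):
    """Shortest angular distance between two center positions (degrees)."""
    diff = abs(second_center - first_center) % 360
    return min(diff, 360 - diff)

PRESET_ANGLE = 10        # preset angle value from the embodiment
first_center = 100.0     # center of the currently displayed picture
second_center = 130.0    # center of the area required by the new position

difference = center_angle_difference(first_center, second_center)
print(difference)                 # 30.0
print(difference > PRESET_ANGLE)  # True: the display area must be adjusted
```

Taking the shortest angular distance keeps the comparison correct for a cylindrical screen even when the two centers straddle the 0°/360° wrap-around.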
In an embodiment, when the angle difference between the area corresponding to the visual range to be displayed and the area of the currently displayed picture is not greater than the preset angle value, the currently displayed picture still provides a good viewing experience for the user, and the display area need not be adjusted.
Step 212: control the display screen to display the picture in the area of the display screen corresponding to the visual range.
The display control method thus decides whether to adjust the preset angle range of the currently displayed picture by judging whether the required adjustment exceeds the preset angle value. When the user's position in front of the electronic device changes significantly, the position of the currently displayed picture is promptly adjusted to follow the user, improving the user experience.
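Putting steps 200 through 212 together, the overall control loop might look like the sketch below (Python; `sensor`, `screen`, and their methods `capture_image`, `locate_user`, and `display_at` are hypothetical placeholders for the device's actual interfaces, and the constant values are illustrative):

```python
import time

PRESET_ANGLE = 10      # degrees; threshold for moving the displayed area
VISIBLE_SPAN = 180     # degrees; the displayed area is centered on the user
DELAY_SECONDS = 0.5    # illustrative preset time between captures

def shortest_diff(a, b):
    """Shortest angular distance between two angles in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def visible_range(position_angle):
    """Display-screen area corresponding to the user's visual range:
    VISIBLE_SPAN degrees centered on the user's position angle."""
    half = VISIBLE_SPAN / 2
    return ((position_angle - half) % 360, (position_angle + half) % 360)

def control_loop(sensor, screen):
    current_center = None
    while True:
        image = sensor.capture_image()       # step 200: acquire image
        angle = sensor.locate_user(image)    # steps 202-208: None if no user found
        if angle is not None and (
            current_center is None
            or shortest_diff(angle, current_center) > PRESET_ANGLE  # step 210
        ):
            screen.display_at(visible_range(angle))  # step 212: move the picture
            current_center = angle
        time.sleep(DELAY_SECONDS)            # delay before the next capture
```

For a user at 100°, `visible_range(100)` yields the (10°, 190°) area of the embodiment; small movements below the 10° threshold leave the picture in place.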
Referring to fig. 7, a hardware structure diagram of an electronic device according to an embodiment of the present application is shown. As shown in fig. 7, the electronic device may implement the above embodiments. The electronic device 50 provided in this application may include a processor 500, a storage device 502, a display 504, an image sensor 584, and a computer program (instructions) stored in the storage device 502 and executable on the processor 500. The electronic device 50 may further include other hardware, such as keys and a communication device, which are not described here. The processor 500 exchanges data with the storage device 502, the display 504, and the image sensor 584 over the bus 506.
The processor 500 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the electronic device 50 and connects the various parts of the entire electronic device 50 using various interfaces and lines.
The storage device 502 may be used to store the computer programs and/or modules, and the processor 500 implements the various functions of the display control method by running or executing the computer programs and/or modules stored in the storage device 502 and calling data stored in the storage device 502. The storage device 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like. In addition, the storage device 502 may include a high-speed random access memory device, and may also include a non-volatile storage device such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card (Flash Card), at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The display 504 may display a User Interface (UI) or a Graphical User Interface (GUI) containing data such as photos, videos, and chat content. The display 504 may also serve as both an input device and an output device, and may include at least one of a Liquid Crystal Display (LCD), a thin-film-transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) touch display, a flexible display, a three-dimensional (3D) touch display, an ink-screen display, and the like.
The processor 500 executes a program corresponding to the executable program code by reading the executable program code stored in the storage device 502, so as to perform the display control method executed by the electronic device in any of the foregoing embodiments.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the description of the embodiments above is provided only to help understand the method and core concept of the present application. Meanwhile, a person skilled in the art may, according to the idea of the present application, make variations to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (20)

  1. A display control method is applied to electronic equipment, the electronic equipment is provided with a cylindrical body and a display screen arranged around the body, and the display control method is characterized by comprising the following steps:
    controlling an image sensor to acquire an image in a preset area range relative to the body;
    judging whether the image contains human body features or not;
    when the image contains the human body characteristics, acquiring position information of a user corresponding to the human body characteristics within the preset area range;
    and determining the visual range of the user relative to the display screen according to the position information, and controlling the display screen to display the picture on the area of the display screen corresponding to the visual range.
  2. The display control method according to claim 1, wherein the position information is a region position angle between a user and a viewing angle edge of the preset region range.
  3. The method according to claim 2, wherein the obtaining of the area position angle between the user and the view angle edge of the preset area range specifically comprises:
    acquiring a segmentation ratio of the human body features on the image width of the image;
    and calculating the region position angle according to the segmentation ratio.
  4. The display control method according to claim 3, wherein the obtaining of the segmentation ratio of the human body feature over the image width of the image specifically comprises:
    acquiring a first distance between the human body feature and a first side edge on the width of the image;
    acquiring a second distance between the human body feature and a second side edge on the width of the image, wherein the first side edge and the second side edge are oppositely arranged;
    and calculating the segmentation ratio according to the first distance and the second distance.
  5. The method according to claim 1, wherein the determining whether the image includes a human body feature specifically includes:
    and judging whether the image with the human face features is identified in the image.
  6. The display control method according to claim 1, wherein the area corresponding to the visual range is an area corresponding to 180 degrees centered on a user on the display screen.
  7. The display control method of claim 1, wherein the display screen is a flexible display screen that is attached around the body.
  8. The method as claimed in claim 1, wherein the determining a visual range of a user relative to the display screen according to the position information, and controlling a picture to be displayed on an area of the display screen corresponding to the visual range, specifically comprises:
    acquiring the area of a current display picture on the display screen;
    judging whether the angle difference between the area corresponding to the visual range to be displayed on the display screen and the area of the current display picture is larger than a preset angle value or not;
    and if so, adjusting the area of the current display picture of the display screen to the area corresponding to the visual range.
  9. The method as claimed in claim 8, wherein the judging whether the angle difference between the area corresponding to the visual range required to be displayed on the display screen and the area of the current display picture is greater than a preset angle value specifically comprises:
    acquiring a first preset angle range of an area where a current display picture on the display screen is located;
    acquiring a second preset angle range of an area which needs to be displayed on the display screen according to the position information;
    and judging whether the difference between the first preset angle range and the second preset angle range is greater than the preset angle value or not.
  10. The method as claimed in claim 8, wherein the judging whether the angle difference between the area corresponding to the visual range required to be displayed on the display screen and the area of the current display picture is greater than a preset angle value specifically comprises:
    acquiring a first central position of an area where a current display picture on the display screen is located;
    acquiring a second central position of an area displayed on the display screen according to the position information;
    and judging whether the angle difference between the first center position and the second center position is larger than the preset angle value or not.
  11. An electronic device comprising a cylindrical body, the electronic device comprising:
    the image sensor is used for acquiring an image in a preset area range relative to the body;
    the display screen is arranged around the body;
    the processor is connected with the image sensor and the display screen and is used for judging whether the image contains human body characteristics; when the image contains the human body characteristics, the processor is used for acquiring the position information of the user corresponding to the human body characteristics within the preset area range; the processor is further used for determining a visual range of a user relative to the display screen according to the position information and controlling a picture to be displayed on an area of the display screen corresponding to the visual range.
  12. The electronic device of claim 11, wherein the location information is a zone location angle between a user and a view angle edge of the preset zone range.
  13. The electronic device of claim 12, wherein the processor is configured to obtain a segmentation ratio of the human feature over an image width of the image, and wherein the processor is further configured to calculate the region position angle based on the segmentation ratio.
  14. The electronic device of claim 13, wherein the processor is configured to acquire a first distance between the human body feature and a first side edge on the image width; the processor is further configured to acquire a second distance between the human body feature and a second side edge on the image width, wherein the first side edge and the second side edge are oppositely arranged; and the processor is further configured to calculate the segmentation ratio according to the first distance and the second distance.
  15. The electronic device of claim 11, wherein the area corresponding to the visual range is an area corresponding to 180 degrees centered on a user on the display screen.
  16. The electronic device according to claim 11, wherein the processor is configured to obtain an area of a currently displayed picture on the display screen, and determine whether an angle difference between an area corresponding to the visual range that needs to be displayed on the display screen and the area of the currently displayed picture is greater than a preset angle value; and when the angle difference between the area corresponding to the visual range required to be displayed on the display screen and the area of the current display picture is larger than a preset angle value, the processor is used for adjusting the area of the current display picture of the display screen to the area corresponding to the visual range.
  17. The electronic device of claim 16, wherein the processor is configured to obtain a first preset angle range of an area where a current display screen on the display screen is located, and further configured to obtain a second preset angle range of an area that needs to be displayed on the display screen according to the position information; the processor is further configured to determine whether a difference between the first preset angle range and the second preset angle range is greater than the preset angle value; when the difference between the first preset angle range and the second preset angle range is larger than the preset angle value, the processor is configured to adjust the area of the current display frame of the display screen to an area corresponding to the visual range.
  18. The electronic device of claim 16, wherein the processor is configured to obtain a first center position of an area on the display screen where a current display screen is located, and further configured to obtain a second center position of an area required to be displayed on the display screen according to the position information; the processor is further configured to determine whether an angle difference between the first center position and the second center position is greater than the preset angle value; when the angle difference between the first center position and the second center position is greater than the preset angle value, the processor is configured to adjust the area of the current display frame of the display screen to an area corresponding to the visual range.
  19. The electronic device of claim 18, wherein the display is a flexible display that fits around the body.
  20. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement a display control method according to any one of claims 1 to 10.
CN201880097634.XA 2018-12-18 2018-12-18 Display control method, electronic device, and computer-readable storage medium Pending CN113168823A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/121834 WO2020124383A1 (en) 2018-12-18 2018-12-18 Display control method, electronic device and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN113168823A true CN113168823A (en) 2021-07-23

Family

ID=71100110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880097634.XA Pending CN113168823A (en) 2018-12-18 2018-12-18 Display control method, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN113168823A (en)
WO (1) WO2020124383A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101609660A (en) * 2008-06-18 2009-12-23 Olympus Corporation Digital photo frame, information processing system, and control method
CN102063887A (en) * 2010-11-19 2011-05-18 天津三星电子显示器有限公司 Method for automatically adjusting visual range of display and display thereof
CN103049084A (en) * 2012-12-18 2013-04-17 深圳国微技术有限公司 Electronic device and method for adjusting display direction according to face direction
US20140152553A1 (en) * 2012-12-05 2014-06-05 Samsung Electronics Co., Ltd. Method of displaying content and electronic device for processing the same
CN104461442A (en) * 2014-12-30 2015-03-25 上海华勤通讯技术有限公司 Regional display method for flexible screen and terminal device
US20160313963A1 (en) * 2015-04-21 2016-10-27 Samsung Electronics Co., Ltd. Electronic device for displaying screen and control method thereof
WO2017121361A1 (en) * 2016-01-14 2017-07-20 深圳前海达闼云端智能科技有限公司 Three-dimensional stereo display processing method and apparatus for curved two-dimensional screen
CN107329570A (en) * 2017-06-29 2017-11-07 合肥步瑞吉智能家居有限公司 Method for automatically adjusting the angle of a display device according to an individual viewer's position
CN107636696A (en) * 2016-06-16 2018-01-26 深圳市柔宇科技有限公司 Multi-user interaction method and apparatus, and companion robot
CN108549528A (en) * 2018-03-30 2018-09-18 努比亚技术有限公司 Display method for a terminal, wearable terminal, and computer-readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108431759A (en) * 2016-11-21 2018-08-21 深圳市柔宇科技有限公司 Electronic device and its display control method
CN108064371A (en) * 2016-12-27 2018-05-22 深圳市柔宇科技有限公司 Control method and device for a flexible display screen


Also Published As

Publication number Publication date
WO2020124383A1 (en) 2020-06-25

Similar Documents

Publication Publication Date Title
US11317022B2 (en) Photographing apparatus for photographing panoramic image using visual elements on a display, and method thereof
US11276183B2 (en) Relocalization method and apparatus in camera pose tracking process, device, and storage medium
RU2624569C2 (en) Image displaying method and device
WO2016134534A1 (en) Method for automatically adjusting camera and electronic device
US9692977B2 (en) Method and apparatus for adjusting camera top-down angle for mobile document capture
US10051180B1 (en) Method and system for removing an obstructing object in a panoramic image
US20210084228A1 (en) Tracking shot method and device, and storage medium
US20130222363A1 (en) Stereoscopic imaging system and method thereof
US11381738B2 (en) Method for a mobile device to photograph a panoramic image, mobile device, and computer readable storage medium and computer product
US9921054B2 (en) Shooting method for three dimensional modeling and electronic device supporting the same
CN111163303B (en) Image display method, device, terminal and storage medium
CN106815809B (en) Picture processing method and device
WO2010026696A1 (en) Image processing device, image processing method, image processing program, and imaging device
US10979700B2 (en) Display control apparatus and control method
CN105592267A (en) Shooting control method, shooting controlling device and shooting system
TW202016880A (en) Image stitching processing method and system thereof
CN112150560A (en) Method and device for determining vanishing point and computer storage medium
US11770603B2 (en) Image display method having visual effect of increasing size of target image, mobile terminal, and computer-readable storage medium
WO2018014517A1 (en) Information processing method, device and storage medium
US10931926B2 (en) Method and apparatus for information display, and display device
US20150095824A1 (en) Method and apparatus for providing user interface according to size of template edit frame
WO2023072030A1 (en) Automatic focusing method and apparatus for lens, and electronic device and computer-readable storage medium
CN113168823A (en) Display control method, electronic device, and computer-readable storage medium
CN113330413A (en) Display control method, electronic device, and computer-readable storage medium
CN115002338B (en) Shooting parameter control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210723