CN113643438A - Control method, control device, control equipment, target equipment and storage medium - Google Patents

Control method, control device, control equipment, target equipment and storage medium

Info

Publication number
CN113643438A
CN113643438A
Authority
CN
China
Prior art keywords
information
target device
position information
user
relative position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010345363.4A
Other languages
Chinese (zh)
Inventor
苑屹
王云飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority to CN202010345363.4A priority Critical patent/CN113643438A/en
Publication of CN113643438A publication Critical patent/CN113643438A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control method, a control device, control equipment, target equipment and a storage medium. The method comprises the following steps: obtaining a foreground image; determining relative position information of the target device relative to the user based on the foreground image; and adjusting the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range. With this method, the target device can be effectively controlled even when it is inconvenient for the user to speak or to operate the device manually, so that the angle between the target device and the user falls within the set range, making it easier for the user to view the target device and improving the user experience.

Description

Control method, control device, control equipment, target equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of control, in particular to a control method, a control device, control equipment, target equipment and a storage medium.
Background
With the development of science and technology, intelligent devices are becoming increasingly popular. An intelligent device is a product formed by introducing a microprocessor, sensor technology and network communication technology into a terminal device (such as a household appliance or an advertising screen).
Intelligent devices are usually controlled by voice or by operating the device's own interface. However, how to control an intelligent household appliance when it is inconvenient for the user to speak or to operate it manually remains an urgent technical problem.
Disclosure of Invention
The embodiments of the invention provide a control method, a control device, control equipment, target equipment and a storage medium, which are used to control intelligent household appliances when it is inconvenient for the user to speak or to operate them manually.
In a first aspect, an embodiment of the present invention provides a control method, applied to a control device, including:
obtaining a foreground image;
determining relative position information of the target device relative to the user based on the foreground image;
and adjusting the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range.
Further, the foreground image includes one or more of a target device gazed by a user and an identifier of the target device, wherein the target device gazed by the user is determined based on the gaze information of the user.
Further, the determining the relative position information of the target device relative to the user based on the foreground image comprises:
when the foreground image comprises at least two markers, determining first image information of the markers contained in the foreground image, wherein the first image information comprises one or more of position information, size information and shape information of the markers, and the first image information is the image information of the markers contained in the foreground image;
determining actual pose information of the marker contained in the foreground image;
determining relative position information of the target device with respect to a user based on the first image information and the actual pose information.
Further, the determining the relative position information of the target device relative to the user based on the foreground image comprises:
determining second image information of the target device in the foreground image, wherein the second image information is image information of the target device in the foreground image;
and comparing the second image information with the images of the poses of the models corresponding to the target equipment, and determining the relative position information of the target equipment relative to the user.
Further, the adjusting the position of the target device based on the relative position information includes:
and sending the relative position information to the target equipment to control the target equipment to adjust the position based on the relative position information.
In a second aspect, an embodiment of the present invention provides a control method, applied to a target device, including:
acquiring relative position information sent by control equipment, wherein the relative position information is position information of the target equipment relative to a user;
and adjusting the position based on the relative position information so that the angle between the target device and the user is within a set range.
Further, after the position adjustment is performed based on the relative position information, the method further includes:
acquiring environment information, wherein the environment information comprises illumination information and sound information;
adjusting a brightness parameter and a volume parameter of the target device based on the environmental information.
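The brightness and volume adjustment above can be sketched as a simple mapping from environment readings to output parameters. The value ranges and linear mapping curves below are illustrative assumptions; the text only states that brightness and volume are adjusted based on illumination and sound information.

```python
def adjust_output(illuminance_lux, noise_db,
                  brightness_range=(10, 100), volume_range=(0, 80)):
    """Map ambient light to screen brightness and ambient noise to volume.

    The 0-1000 lux and 30-90 dB spans, and the linear mapping, are
    illustrative assumptions, not values specified by the text.
    """
    # Brighter rooms call for a brighter screen.
    b_lo, b_hi = brightness_range
    brightness = b_lo + (b_hi - b_lo) * min(max(illuminance_lux, 0), 1000) / 1000
    # Noisier rooms call for a higher volume.
    v_lo, v_hi = volume_range
    volume = v_lo + (v_hi - v_lo) * min(max(noise_db - 30, 0), 60) / 60
    return round(brightness), round(volume)
```

A dim, quiet room thus yields low brightness and volume, while a bright, noisy one pushes both toward their maxima.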
In a third aspect, an embodiment of the present invention provides a control apparatus, configured to a control device, including:
the acquisition module is used for acquiring a foreground image;
a determining module for determining relative position information of the target device relative to the user based on the foreground image;
and the adjusting module is used for adjusting the position of the target equipment based on the relative position information so as to enable the angle between the target equipment and the user to be within a set range.
In a fourth aspect, an embodiment of the present invention provides a control apparatus, configured to a target device, including:
the acquisition module is used for acquiring relative position information sent by the control equipment, wherein the relative position information is the position information of the target equipment relative to a user;
and the adjusting module is used for adjusting the position based on the relative position information so as to enable the angle between the target device and the user to be within a set range.
In a fifth aspect, an embodiment of the present invention provides a control apparatus, including:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the invention.
In a sixth aspect, an embodiment of the present invention provides a target device, including:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to the second aspect of the invention.
In a seventh aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method provided by the embodiment of the present invention.
The embodiments of the invention provide a control method, a control device, control equipment, target equipment and a storage medium. The scheme first obtains a foreground image; next, it determines relative position information of the target device relative to the user based on the foreground image; finally, it adjusts the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range. With this scheme, the target device can be effectively controlled even when it is inconvenient for the user to speak or to operate the device manually, so that the angle between the target device and the user falls within the set range, making it easier for the user to view the target device and improving the user experience.
Drawings
Fig. 1 is a schematic flowchart of a control method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a control method according to a second embodiment of the present invention;
fig. 3 is a schematic flowchart of a control method according to a third embodiment of the present invention;
fig. 4 is a schematic flowchart of a control method according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a control device according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a control device according to a sixth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a control device according to a seventh embodiment of the present invention;
fig. 8 is a schematic structural diagram of a target device according to an eighth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like. In addition, the embodiments and features of the embodiments in the present invention may be combined with each other without conflict.
The term "include" and variations thereof as used herein are intended to be open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment".
Example one
Fig. 1 is a schematic flowchart of a control method according to an embodiment of the present invention, where the method is applicable to a case of controlling a target device, and the method may be executed by a control apparatus provided in the present invention, where the apparatus may be implemented by software and/or hardware, and is generally integrated on a control device. The control device in this embodiment includes, but is not limited to: augmented Reality (AR) devices, such as glasses devices and AR eye movement devices. The control device can be used for controlling a target device, and the target device can be a device such as a television or an advertisement screen which can play content for being viewed by a user or other devices which can adjust the position and the orientation, such as an intelligent air conditioner, an intelligent robot and the like.
As shown in fig. 1, a control method according to a first embodiment of the present invention includes the following steps:
and S110, acquiring a foreground image.
The embodiment controls the target device through the control device, and can control the position of the target device when the target device is controlled, such as the distance and the angle of the target device relative to the user.
The control device may be regarded as a device for controlling the target device, and the control device may be an AR device. The foreground image can be obtained through a front camera of the control device. The foreground image may include objects in the user's field of view.
This step may trigger execution of the method described in this embodiment when the user has a need to control the target device, so as to control the target device based on the obtained foreground image. The triggering means is not limited, and the triggering means can be controlled by a physical key, can be controlled by voice, or can be controlled by watching information, wherein the watching information comprises watching duration, a watching point and/or a watching direction, and the watching information is the watching information of the user on the foreground image.
Illustratively, when the dwell time of the gaze point is greater than a preset time threshold, the foreground image is captured by the front camera.
S120, determining the relative position information of the target device relative to the user based on the foreground image.
Relative position information may be understood as the position information of the target device relative to the user. After the foreground image is obtained, this step may analyze the foreground image to determine the target device and its relative position information with respect to the user. Specifically, this step may determine, based on the gaze information corresponding to the obtained foreground image, the target device the user is gazing at in the foreground image; that is, the device at the position of the gaze point in the foreground image is taken as the target device.
Specifically, when determining the relative position information based on the foreground image, this step may decide a specific manner of determining the relative position information based on specific content included in the foreground image.
For example, when the target device is included in the foreground image, this step may determine the relative position information based on the target device included in the foreground image; when the identifier of the target device is included in the foreground image, relative location information is determined based on the identifier of the target device.
A marker may store its own position information relative to the target device in the actual scene, such as coordinates. The control device may determine the coordinates of a marker by recognizing it; for example, a coordinate system may be constructed with the target device's location as the origin, the coordinates of the marker determined in that coordinate system, and those coordinates stored in the marker. Illustratively, a marker may be a two-dimensional code, a barcode, or the like. At least two markers may be disposed around the target device, and each marker may have a unique identifier. Each marker may be a pattern, such as a two-dimensional code or barcode, that can store its own position information, or it may be a unique object, in which case the control device may pre-store the relationship between the marker and its position information and identify the marker to determine its position. In addition, a marker may also store, or correspond to, other information about the target device or the markers, such as identification information of the target device, coordinate information of the target device, and position information between markers.
In one embodiment, when determining the relative position information based on the target device in the foreground image, the pose information of the target device in the foreground image (that is, its position information and posture information) may be matched against images of each pose of the model corresponding to the target device in a 3D model database, and the position information corresponding to the matched pose information may be determined as the relative position information of the target device relative to the user.
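The matching against a database of model pose images can be sketched as a nearest-neighbor search. The feature representation (a flattened appearance vector per candidate pose) and the database layout below are illustrative assumptions, not details specified by the text.

```python
import numpy as np

def match_pose(observed, pose_db):
    """Find the stored model pose whose appearance best matches the
    target device's appearance in the foreground image.

    `observed` is a feature vector (e.g. a downsampled image patch) of
    the device in the foreground image; `pose_db` is a list of
    (feature_vector, relative_position) pairs, one per candidate pose.
    Returns the relative position implied by the best-matching pose.
    """
    best_pos, best_err = None, float("inf")
    for vec, rel_pos in pose_db:
        err = float(np.sum((observed - vec) ** 2))  # sum of squared differences
        if err < best_err:
            best_pos, best_err = rel_pos, err
    return best_pos
```

A production system would use a more robust similarity measure and an indexed search, but the lookup structure is the same.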
In one embodiment, when determining the relative position information based on the markers in the foreground image, the relative position information may also be determined in the same manner as in the above example, that is, by matching against the 3D model database.
In one embodiment, when determining the relative position information based on the markers in the foreground image, the position information of the markers in the foreground image may be determined first, and then a translation matrix and a rotation matrix may be determined based on the position information of the markers in the actual scene, so as to determine the relative position between the coordinate system of the front camera and the coordinate system of the target device in the actual scene. Specifically, this step may detect and identify as many markers as possible in the foreground image through an image algorithm, and then determine the rotation matrix and the translation matrix by combining the position information of the identified markers in the actual scene, thereby determining the relative position information.
Generally, if a sufficient number of markers are captured by the front camera, each marker can be treated as a single point in the foreground image. In this case, since the correspondence of each marker is known (it can be pre-stored in the marker itself, or the correspondence between the markers can be pre-stored in the control device) and parameters such as the focal length of the front camera are known, the pair of rotation and translation matrices that best fits the current foreground image can be determined from the position information of the markers in the foreground image and the position information of the real markers. This reflects the relative position relationship between the two coordinate systems and thereby determines the relative position information of the target device relative to the user.
If the number of markers is small, each marker itself may need to be regarded as a plurality of points; for example, its four corners may be treated as four points in space, that is, the shape information or size information of the marker is used. Accordingly, the position information of the four corners may be stored in the marker, or pre-stored in the control device.
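The rotation-and-translation solve described above is the classic perspective-n-point (PnP) problem from computer vision. As a self-contained illustration of recovering a pose from point correspondences, the sketch below solves the simpler 3D-to-3D variant with the Kabsch algorithm; a real implementation working from pixel positions and camera intrinsics would use a PnP solver (such as OpenCV's `solvePnP`) instead.

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t with dst ~ R @ src + t (Kabsch).

    `src` and `dst` are (N, 3) arrays of corresponding marker positions,
    e.g. marker coordinates in the device frame and in the camera frame.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Given noiseless, non-degenerate correspondences this recovers the exact rotation and translation between the two coordinate systems.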
S130, based on the relative position information, the position of the target device is adjusted, so that the angle between the target device and the user is within a set range.
After the relative position information is determined, this step may control the target device based on it. Specifically, this step may directly send the relative position information to the target device, so that the target device adjusts its position based on the relative position information until the angle between the target device and the user is within a set range, which may be regarded as a range that lets the user view the target device comfortably. When the target device directly faces the user, the adjustment is complete. Alternatively, this step may determine the target position information of the target device, that is, the adjusted position information, directly from the user's current position information (obtained, for example, by a position sensor of the control device) and the relative position information, and then send the target position information to the target device so that the angle between the target device and the user falls within the set range.
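The "angle within a set range" condition can be sketched as follows. The planar coordinate convention and the 10-degree default set range are assumptions for illustration; the text does not fix either.

```python
import math

def angle_to_user(device_facing_deg, device_pos, user_pos):
    """Angle in degrees between the device's facing direction and the
    direction from the device to the user, in the horizontal plane.
    Positions are (x, y) tuples in an assumed planar coordinate system.
    """
    dx, dy = user_pos[0] - device_pos[0], user_pos[1] - device_pos[1]
    to_user_deg = math.degrees(math.atan2(dy, dx))
    # Wrap the difference into (-180, 180] before taking its magnitude.
    diff = (to_user_deg - device_facing_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

def needs_adjustment(device_facing_deg, device_pos, user_pos, set_range_deg=10.0):
    """True while the device is not yet facing the user closely enough."""
    return angle_to_user(device_facing_deg, device_pos, user_pos) > set_range_deg
```

A device facing straight at the user gives an angle of zero, so the adjustment loop stops.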
The embodiment of the invention provides a control method, which comprises the steps of firstly, obtaining a foreground image; secondly, determining relative position information of the target equipment relative to the user based on the foreground image; and finally, based on the relative position information, the position of the target equipment is adjusted so that the angle between the target equipment and the user is within a set range.
Further, the foreground image includes one or more of a target device gazed by a user and an identifier of the target device, wherein the target device gazed by the user is determined based on the gaze information of the user.
The foreground image may include the target device gazed at by the user and/or one or more markers around the target device. The periphery may be regarded as a set area around the target device; the set area may be determined according to the actual scene and is not limited here.
Further, the determination of the target device watched by the user based on the gazing information of the user specifically includes:
acquiring gazing information of a user;
when the dwell time of the user's gaze point on a device is greater than a preset time threshold, that device is determined to be the target device gazed at by the user, that is, the target device the user wants to control. When the foreground image contains multiple devices, the gaze information is needed to determine which device the user intends to control.
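The dwell-time selection above can be sketched as a small state machine. The 1.5-second default threshold and the monotonic-clock injection are illustrative choices; the text only specifies the dwell-time-versus-threshold comparison.

```python
import time

class GazeTargetSelector:
    """Select the device the user is gazing at once the gaze point has
    dwelt on it longer than a threshold."""

    def __init__(self, dwell_threshold_s=1.5, clock=time.monotonic):
        self.dwell_threshold_s = dwell_threshold_s
        self.clock = clock          # injectable for testing
        self._current = None        # device currently under the gaze point
        self._since = None          # when the gaze landed on it

    def update(self, device_under_gaze):
        """Feed the device under the current gaze point (or None).
        Returns the selected target device, or None while still dwelling."""
        now = self.clock()
        if device_under_gaze != self._current:
            # Gaze moved to a different device: restart the dwell timer.
            self._current, self._since = device_under_gaze, now
            return None
        if device_under_gaze is not None and now - self._since >= self.dwell_threshold_s:
            return device_under_gaze
        return None
```

Calling `update` on every gaze sample returns the target device only after the gaze has stayed on it past the threshold, which filters out incidental glances.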
Eye tracking may be used to acquire the user's gaze information. Currently, the optical recording method is widely used to compute gaze information: a camera or video camera records the eye movements of the subject (that is, the user) to obtain eye images reflecting the eye movement, and eye features such as eye movement data are extracted from the acquired eye images to build a model for gaze/fixation-point estimation. Eye features may include, but are not limited to: pupil position, pupil shape, iris position, iris shape, eyelid position, canthus position, and spot (also known as Purkinje spot) position.
Among optical recording methods, the eye tracking method which is currently the mainstream is called pupil-cornea reflex method.
In addition to optical recording, there are other ways to achieve tracking including, but not limited to, the following:
1. the eye tracking device may be a MEMS micro-electro-mechanical system, for example comprising a MEMS infrared scanning mirror, an infrared light source, an infrared receiver.
2. In other embodiments, the eye tracking device may also be a contact/non-contact sensor (e.g., electrode, capacitive sensor) that detects eye movement by the capacitance between the eye and the capacitive plate.
3. In yet another embodiment, the eye tracking device may also be a myoelectric current detector, for example by placing electrodes at the bridge of the nose, forehead, ears or earlobe, detecting eye movements by the detected myoelectric current signal pattern.
The working principle of the pupil-cornea reflex method can be summarized as follows: acquiring an eye image; the gaze/fixation point is estimated from the eye image.
The hardware requirements for the pupillary-corneal reflex method may be:
(1) light source: an infrared light source is generally used, because infrared light does not affect vision; a plurality of infrared light sources may be arranged in a predetermined pattern, such as a triangle or a straight line;
(2) an image acquisition device: such as an infrared camera device, an infrared image sensor, a camera or a video camera, etc.
The pupil-cornea reflection method can be implemented as follows:
part 1. eye image acquisition:
the light source irradiates the eye, the eye is shot by the image acquisition equipment, and the reflection point of the light source on the cornea, namely a light spot (also called a purkinje spot), is shot correspondingly, so that the eye image with the light spot is obtained.
Part 2. gaze/gaze point estimation:
when the eyeballs rotate, the relative position relationship between the pupil center and the light spots changes, and a plurality of eye images with the light spots correspondingly acquired reflect the position change relationship; and estimating the sight line/the fixation point according to the position change relation.
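The estimation step can be sketched as a calibration that maps pupil-center-minus-spot vectors to on-screen gaze points. The second-order polynomial form below is a common choice in pupil-cornea reflex systems, not something the text mandates.

```python
import numpy as np

def fit_gaze_map(pupil_glint_vecs, screen_points):
    """Fit a second-order polynomial map from pupil-center-minus-spot
    vectors (vx, vy) to on-screen gaze points, as in a typical
    pupil-cornea reflex calibration.

    `pupil_glint_vecs` is (N, 2); `screen_points` is (N, 2).
    Returns a (6, 2) coefficient matrix, one column per screen axis.
    """
    vx, vy = pupil_glint_vecs[:, 0], pupil_glint_vecs[:, 1]
    # Design matrix: [1, vx, vy, vx*vy, vx^2, vy^2]
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs

def estimate_gaze(coeffs, vec):
    """Apply the fitted map to one pupil-glint vector."""
    vx, vy = vec
    feats = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return feats @ coeffs
```

During calibration the user fixates known screen points to collect the training pairs; afterwards `estimate_gaze` converts each new pupil-glint vector into a gaze point.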
Determining the target device from the gaze information makes it possible to understand the user's intention more accurately and to avoid misoperation. In addition, when the foreground image contains multiple devices, the target device to be controlled can be determined with the eyes alone, which is convenient for users who have difficulty moving or speaking.
Example two
Fig. 2 is a flowchart illustrating a control method according to a second embodiment of the present invention, which is embodied on the basis of the first embodiment. In this embodiment, determining the relative position information of the target device with respect to the user based on the foreground image specifically includes: when the foreground image comprises at least two markers, determining first image information of the markers contained in the foreground image, wherein the first image information comprises one or more of position information, size information and shape information of the markers, and the first image information is the image information of the markers contained in the foreground image;
determining actual pose information of the marker contained in the foreground image;
determining relative position information of the target device with respect to a user based on the first image information and the actual pose information.
Further, adjusting the position of the target device based on the relative position information specifically includes:
and sending the relative position information to the target equipment to control the target equipment to adjust the position based on the relative position information.
As shown in fig. 2, a second embodiment of the present invention provides a control method, including the following steps:
and S210, acquiring a foreground image.
S220, when the foreground image comprises at least two markers, determining first image information of the markers contained in the foreground image.
After at least two markers are identified in the foreground image, this step may crop the foreground image and determine the first image information, which can be regarded as the image regions containing the markers in the foreground image. The pose information of each marker and the relative position information between the markers can be determined based on the first image information.
The first image information includes one or more of position information, size information, and shape information of the marker, and the first image information is image information of the marker included in the foreground image. The size information may be understood as information indicating the size of the marker. The shape information may be understood as information indicating the shape of the marker.
S230, determining the actual pose information of the marker contained in the foreground image.
The actual pose information can be regarded as position information, size information, and posture information of the marker in the actual scene.
After the marker included in the foreground image is determined, the step can determine the actual pose information of the marker included in the foreground image, that is, the position information of the marker in the actual scene. The actual pose information may be determined by identifying the marker or may be pre-stored in the control device.
S240, determining relative position information of the target device relative to the user based on the first image information and the actual pose information.
In this step, a translation matrix and a rotation matrix may be determined based on the position information in the first image information and the actual pose information, thereby determining the relative position information of the target device with respect to the user.
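One standard way to recover such a translation and rotation, sketched here purely for illustration, is the Kabsch algorithm applied to corresponding marker points. The patent does not prescribe this particular algorithm, and the point sets and the 90° example pose below are invented:

```python
import numpy as np

def rigid_transform(actual, observed):
    """Estimate rotation R and translation t with observed ≈ actual @ R.T + t
    (Kabsch algorithm on corresponding 3D marker points)."""
    A = np.asarray(actual, dtype=float)
    B = np.asarray(observed, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Four marker positions in the actual scene (non-coplanar for a stable fit):
actual = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
theta = np.pi / 2                                # rotate 90° about the z-axis...
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
observed = actual @ Rz.T + np.array([2.0, 0.0, 0.0])  # ...then shift by (2, 0, 0)
R, t = rigid_transform(actual, observed)
```

In a deployed system the "observed" points would come from the first image information (after back-projection) and the "actual" points from the stored actual pose information; the recovered R and t then give the relative position.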
S250, sending the relative position information to the target device so as to control the target device to adjust its position based on the relative position information.
The second embodiment of the present invention provides a control method that embodies the operation of determining the relative position information of the target device with respect to the user and the operation of adjusting the position of the target device. With this method, the position of the target device is effectively adjusted by means of the markers in the foreground image, improving the user's experience of using the target device.
EXAMPLE III
Fig. 3 is a schematic flow chart of a control method according to a third embodiment of the present invention, which is embodied on the basis of the first embodiment of the present invention, and the determining, based on the foreground image, the relative position information of the target device with respect to the user specifically includes:
determining second image information of the target device in the foreground image, wherein the second image information is image information of the target device in the foreground image;
and comparing the second image information with the images of the poses of the models corresponding to the target equipment, and determining the relative position information of the target equipment relative to the user.
Further, adjusting the position of the target device based on the relative position information specifically includes:
sending the relative position information to the target device to control the target device to adjust its position based on the relative position information.
As shown in fig. 3, a third embodiment of the present invention provides a control method, including the following steps:
S310, acquiring a foreground image.
S320, determining second image information of the target device in the foreground image, wherein the second image information is the image information of the target device in the foreground image.
When the target device is recognized in the foreground image, this step may extract from the foreground image the image determined to contain the target device, that is, the second image information. The target device can then be controlled based on the pose information of the target device in the second image information.
S330, comparing the second image information with the images of the poses of the models corresponding to the target equipment, and determining the relative position information of the target equipment relative to the user.
After the second image information is determined, a database containing a 3D model of the target device may be searched, and the pose of the 3D model may be varied to match the pose information of the target device in the second image information. When the pose information of the 3D model in the database matches the pose information identified in the second image information, that is, when the deviation is within a set range, the relative position information of the target device with respect to the user may be determined based on the position information of the current 3D model. The initial position of the 3D model may be taken as the position information of the target device after the last adjustment; if the target device has not been adjusted, the initial position may be the initial position information of the target device.

In addition, when fitting with the 3D model, the fitting result may also be determined in combination with information carried by the markers in the foreground image in order to further improve its accuracy. The corresponding marker may also include orientation information of the marker with respect to the target device, that is, information indicating the orientation of the marker relative to the target device, such as left, right, or above.
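The model-matching loop can be pictured as a search over candidate poses of the stored 3D model, accepting the pose whose deviation from the observed image information falls within the set range. The sketch below is a deliberately simplified hypothetical — a yaw-only search with orthographic projection and an invented tolerance — not the method the disclosure prescribes:

```python
import numpy as np

def project(points, yaw):
    """Rotate 3D model points about the z-axis by `yaw`, then project to 2D
    by dropping the z coordinate (a crude orthographic camera)."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return (points @ R.T)[:, :2]

def fit_pose(model, observed_2d, tolerance=1e-3):
    """Search candidate yaw angles; return (best_yaw, matched), where
    `matched` means the deviation fell within the set range (tolerance)."""
    candidates = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
    errors = [np.linalg.norm(project(model, y) - observed_2d)
              for y in candidates]
    best = int(np.argmin(errors))
    return candidates[best], errors[best] <= tolerance

model = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
observed = project(model, np.pi / 4)   # pretend the device sits at 45° yaw
yaw, matched = fit_pose(model, observed, tolerance=0.01)
```

A practical implementation would search over full 6-DoF poses with a real camera model; the point of the sketch is only the accept-when-deviation-is-within-the-set-range structure.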
S340, sending the relative position information to the target equipment to control the target equipment to adjust the position based on the relative position information.
The third embodiment of the present invention provides a control method that embodies the operation of determining the relative position information and the manner of adjusting the target device. The position of the target device in the actual scene is effectively adjusted by means of the target device in the foreground image, improving the user experience of the target device.
Example four
Fig. 4 is a flowchart of a control method according to a fourth embodiment of the present invention. The method is applicable to the case of controlling a target device and is executed by a control apparatus according to the present invention, where the apparatus may be implemented by software and/or hardware and is generally integrated on the target device. For details not described in this embodiment, reference may be made to the above embodiments.
As shown in fig. 4, a fourth embodiment of the present invention provides a control method, including:
S410, obtaining relative position information sent by the control device, wherein the relative position information is the position information of the target device relative to the user.
Adjustment of the target device location may be achieved based on the relative location information.
S420, adjusting the position based on the relative position information so that the angle between the target device and the user is within a set range.
After the target device obtains the relative position information, it may adjust its position based on the relative position information, for example, by adjusting its current position based on the current position information and the relative position information, so that the angle between the target device and the user is within the set range.
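A minimal sketch of such an adjustment, under the assumption that the relative position information yields the user's position and the device's facing direction; all names and the 5° set range below are hypothetical:

```python
import numpy as np

def viewing_angle(facing, to_user):
    """Angle (radians) between the device's facing direction and the
    direction from the device toward the user."""
    f = np.asarray(facing, dtype=float)
    u = np.asarray(to_user, dtype=float)
    f = f / np.linalg.norm(f)
    u = u / np.linalg.norm(u)
    return float(np.arccos(np.clip(np.dot(f, u), -1.0, 1.0)))

def adjust(facing, user_pos, device_pos, set_range=np.radians(5)):
    """If the viewing angle exceeds the set range, turn the device to face
    the user; otherwise leave it unchanged. Returns the new facing vector."""
    to_user = np.asarray(user_pos, dtype=float) - np.asarray(device_pos, dtype=float)
    if viewing_angle(facing, to_user) <= set_range:
        return np.asarray(facing, dtype=float)
    return to_user / np.linalg.norm(to_user)     # rotate to face the user

# Device at the origin facing +x; user off to the side at (0, 2):
new_facing = adjust(facing=[1.0, 0.0], user_pos=[0.0, 2.0],
                    device_pos=[0.0, 0.0])
```

Real hardware would drive a motor toward the target orientation rather than snapping to it, but the within-set-range check is the same.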
The fourth embodiment of the present invention provides a control method: first, relative position information sent by the control device is obtained, where the relative position information is the position information of the target device relative to the user; then, position adjustment is performed based on the relative position information so that the angle between the target device and the user is within a set range. With this method, the target device is effectively controlled even when it is inconvenient for the user to speak or to control the device manually, so that the angle between the target device and the user falls within the set range, making it convenient for the user to view the target device and improving the user experience.
Further, after the position adjustment is performed based on the relative position information, the method further includes:
acquiring environment information, wherein the environment information comprises illumination information and sound information;
adjusting a brightness parameter and a volume parameter of the target device based on the environmental information.
The context information may be understood as information in the context in which the target device is located. The illumination information may be considered as data of illumination. The sound information may be considered as data of sound in the environment. The brightness of the target device can be adjusted by adjusting the brightness parameter, and the volume of the target device can be adjusted by adjusting the volume parameter.
The manner of acquiring the environment information depends on its specific content and is not limited here.
After the environment information is acquired, the brightness parameter and the volume parameter of the target device can be adjusted based on the environment information. The correspondence between different environment information and the brightness and volume parameters is not limited here and can be determined according to the usage habits of the user. On the basis of automatically adjusting the target device, the present invention automatically adjusts the brightness parameter and the volume parameter according to the environmental parameters, ensuring that the adjusted target device provides the best playback effect for the user.
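As a hypothetical illustration of such a correspondence, the sketch below maps measured illumination and ambient sound level to brightness and volume with clamped linear curves; the break points are invented placeholders standing in for learned user habits, not values from this disclosure:

```python
def adjust_playback(lux, ambient_db,
                    min_brightness=10, max_brightness=100,
                    min_volume=0, max_volume=100):
    """Map illumination (lux) and ambient sound level (dB) to brightness and
    volume percentages with simple clamped linear curves."""
    # Brighter rooms -> brighter screen: 0 lx -> min, >= 500 lx -> max.
    brightness = (min_brightness
                  + (max_brightness - min_brightness) * min(lux, 500) / 500)
    # Louder rooms -> louder playback: 30 dB -> min, >= 80 dB -> max.
    volume = (min_volume
              + (max_volume - min_volume) * min(max(ambient_db - 30, 0), 50) / 50)
    return round(brightness), round(volume)

# A moderately lit room (250 lx) with normal conversation levels (55 dB):
settings = adjust_playback(lux=250, ambient_db=55)
```

Per-user tables or a small learned model could replace the linear curves without changing the surrounding flow.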
EXAMPLE five
Fig. 5 is a schematic structural diagram of a control apparatus according to a fifth embodiment of the present invention, where the apparatus is applicable to a case of controlling a target device, where the apparatus may be implemented by software and/or hardware and is generally configured on a control device.
As shown in fig. 5, the apparatus includes:
an obtaining module 51, configured to obtain a foreground image;
a determining module 52 for determining relative position information of the target device with respect to the user based on the foreground image;
and an adjusting module 53, configured to adjust a position of the target device based on the relative position information, so that an angle between the target device and the user is within a set range.
In the present embodiment, the foreground image is obtained by the obtaining module 51; determining, by the determination module 52, relative location information of the target device with respect to the user based on the foreground image; and adjusting the position of the target device by an adjusting module 53 based on the relative position information so that the angle between the target device and the user is within a set range.
The control apparatus provided in this embodiment can effectively control the target device when it is inconvenient for the user to speak or to control the device manually, so that the angle between the target device and the user is within a set range, making it convenient for the user to view the target device and improving the user experience of the target device.
Further, the foreground image includes one or more of a target device gazed by a user and an identifier of the target device, wherein the target device gazed by the user is determined based on the gaze information of the user.
Further, the determining module 52 specifically includes:
when the foreground image comprises at least two markers, determining first image information of the markers contained in the foreground image, wherein the first image information comprises one or more of position information, size information and shape information of the markers, and the first image information is the image information of the markers contained in the foreground image;
determining actual pose information of the marker contained in the foreground image;
determining relative position information of the target device with respect to a user based on the first image information and the actual pose information.
Further, the determining module 52 specifically includes:
determining second image information of the target device in the foreground image, wherein the second image information is image information of the target device in the foreground image;
and comparing the second image information with the images of the poses of the models corresponding to the target equipment, and determining the relative position information of the target equipment relative to the user.
Further, the adjusting module 53 specifically includes:
sending the relative position information to the target device to control the target device to adjust its position based on the relative position information.
EXAMPLE six
Fig. 6 is a schematic structural diagram of a control apparatus according to a sixth embodiment of the present invention, where the apparatus is applicable to a case of controlling a target device, where the apparatus may be implemented by software and/or hardware and is generally configured on the target device.
As shown in fig. 6, the apparatus includes:
an obtaining module 61, configured to obtain relative location information sent by a control device, where the relative location information is location information of the target device relative to a user;
and an adjusting module 62, configured to perform position adjustment based on the relative position information, so that an angle between the target device and the user is within a set range.
The apparatus provided in this embodiment obtains, by an obtaining module 61, relative position information sent by a control device, where the relative position information is position information of the target device relative to a user; and adjusting the position based on the relative position information through an adjusting module 62 so that the angle between the target device and the user is within a set range.
The control apparatus provided in this embodiment can effectively control the target device when the user finds it inconvenient to speak or to perform manual control, so that the angle between the target device and the user is within a set range, which makes it convenient for the user to view the target device and improves the user experience of the target device.
Further, after the position adjustment is performed based on the relative position information, the method further includes:
acquiring environment information, wherein the environment information comprises illumination information and sound information;
adjusting a brightness parameter and a volume parameter of the target device based on the environmental information.
EXAMPLE seven
Fig. 7 is a schematic structural diagram of a control device according to a seventh embodiment of the present invention, and as shown in fig. 7, the seventh embodiment of the present invention provides a control device including: one or more processors 71 and storage 72; the processor 71 in the control device may be one or more, and fig. 7 illustrates one processor 71; the storage device 72 is used to store one or more programs; the one or more programs are executed by the one or more processors 71, so that the one or more processors 71 implement the control method according to any one of the first, second, or third embodiments of the present invention.
The control apparatus may further include: an input device 73, an output device 74 and a communication device 75, the communication device 75 for communicating with a target apparatus. The communication device 75 is connected to one or more processors 71.
The processor 71, the storage device 72, the input device 73, the output device 74 and the communication device 75 in the control apparatus may be connected by a bus or other means, and the connection by the bus is exemplified in fig. 7.
The storage device 72 in the control apparatus is used as a computer-readable storage medium for storing one or more programs, which may be software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the control method provided in the first, second, or third embodiment of the present invention (for example, the modules in the control device shown in fig. 5 include the obtaining module 51, the determining module 52, and the adjusting module 53). The processor 71 executes various functional applications of the control device and data processing by executing software programs, instructions and modules stored in the storage device 72, that is, implements the control method in the above-described method embodiment.
The storage device 72 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the control apparatus, and the like. Further, the storage device 72 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the storage 72 may further include memory located remotely from the processor 71, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 73 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the control apparatus. The output device 74 may include a display device such as a display screen.
And, when one or more programs included in the above-described control apparatus are executed by the one or more processors 71, the programs perform the following operations:
obtaining a foreground image;
determining relative position information of the target device relative to the user based on the foreground image;
and adjusting the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range.
Example eight
Fig. 8 is a schematic structural diagram of a target device according to an eighth embodiment of the present invention. As shown in fig. 8, an eighth embodiment of the present invention provides a target device, including: one or more processors 81 and storage 82; the processor 81 in the target device may be one or more, and one processor 81 is taken as an example in fig. 8; the storage 82 is used to store one or more programs; the one or more programs are executed by the one or more processors 81, so that the one or more processors 81 implement the control method according to the fourth embodiment of the present invention.
The target device may further include: an input means 83, an output means 84 and a communication means 85, the communication means 85 being for communicating with the control device. The communication device 85 is connected to one or more processors 81.
The processor 81, the storage device 82, the input device 83, the output device 84, and the communication device 85 in the target apparatus may be connected by a bus or other means, and the bus connection is exemplified in fig. 8.
The storage device 82 in the target device serves as a computer-readable storage medium for storing one or more programs, which may be software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the control method provided in the fourth embodiment of the present invention (for example, the modules in the control apparatus shown in fig. 6 include the obtaining module 61 and the adjusting module 62). The processor 81 executes various functional applications and data processing of the target device by running the software programs, instructions, and modules stored in the storage device 82, that is, implements the control method in the above method embodiment.
The storage 82 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the target device, and the like. Further, the storage 82 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the storage 82 may further include memory located remotely from the processor 81, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 83 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the target apparatus. The output device 84 may include a display device such as a display screen.
And, when the one or more programs included in the above-mentioned target device are executed by the one or more processors 81, the programs perform the following operations:
acquiring relative position information sent by control equipment, wherein the relative position information is position information of the target equipment relative to a user;
and adjusting the position based on the relative position information so that the angle between the target device and the user is within a set range.
Example nine
An embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, is configured to perform the control method provided by the present invention, and the method includes a method applied to a control device and a method applied to a target device.
The method applied to the control equipment comprises the following steps:
obtaining a foreground image;
determining relative position information of the target device relative to the user based on the foreground image;
and adjusting the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range.
The method applied to the target device comprises the following steps:
acquiring relative position information sent by control equipment, wherein the relative position information is position information of the target equipment relative to a user;
and adjusting the position based on the relative position information so that the angle between the target device and the user is within a set range.
Optionally, the program may be further configured to perform the control method provided in any of the embodiments of the present invention when executed by the processor.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take a variety of forms, including, but not limited to: an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A control method is applied to a control device and comprises the following steps:
obtaining a foreground image;
determining relative position information of the target device relative to the user based on the foreground image;
and adjusting the position of the target device based on the relative position information so that the angle between the target device and the user is within a set range.
2. The method of claim 1, wherein the foreground image comprises one or more of a target device gazed by the user and an identifier of the target device, wherein the target device gazed by the user is determined based on the gaze information of the user.
3. The method of claim 2, wherein determining relative position information of a target device with respect to a user based on the foreground image comprises:
when the foreground image comprises at least two markers, determining first image information of the markers contained in the foreground image, wherein the first image information comprises one or more of position information, size information and shape information of the markers, and the first image information is the image information of the markers contained in the foreground image;
determining actual pose information of the marker contained in the foreground image;
determining relative position information of the target device with respect to a user based on the first image information and the actual pose information.
4. The method of claim 2, wherein determining relative position information of a target device with respect to a user based on the foreground image comprises:
determining second image information of the target device in the foreground image, wherein the second image information is image information of the target device in the foreground image;
and comparing the second image information with the images of the poses of the models corresponding to the target equipment, and determining the relative position information of the target equipment relative to the user.
5. The method of claim 1, wherein the adjusting the location of the target device based on the relative location information comprises:
sending the relative position information to the target device to control the target device to adjust the position based on the relative position information.
6. A control method, applied to a target device, includes:
acquiring relative position information sent by control equipment, wherein the relative position information is position information of the target equipment relative to a user;
and adjusting the position based on the relative position information so that the angle between the target device and the user is within a set range.
7. The method of claim 6, further comprising, after performing position adjustment based on the relative position information:
acquiring environment information, wherein the environment information comprises illumination information and sound information;
adjusting a brightness parameter and a volume parameter of the target device based on the environmental information.
8. A control device, provided in a control device, comprising:
the acquisition module is used for acquiring a foreground image;
a determining module for determining relative position information of the target device relative to the user based on the foreground image;
and the adjusting module is used for adjusting the position of the target equipment based on the relative position information so as to enable the angle between the target equipment and the user to be within a set range.
9. A control device, which is provided in a target device, includes:
the acquisition module is used for acquiring relative position information sent by the control equipment, wherein the relative position information is the position information of the target equipment relative to a user;
and the adjusting module is used for adjusting the position based on the relative position information so as to enable the angle between the target device and the user to be within a set range.
10. A control apparatus, characterized by comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
11. A target device, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 6-7.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202010345363.4A 2020-04-27 2020-04-27 Control method, control device, control equipment, target equipment and storage medium Pending CN113643438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010345363.4A CN113643438A (en) 2020-04-27 2020-04-27 Control method, control device, control equipment, target equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010345363.4A CN113643438A (en) 2020-04-27 2020-04-27 Control method, control device, control equipment, target equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113643438A (en) 2021-11-12

Family

ID=78415034

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010345363.4A Pending CN113643438A (en) 2020-04-27 2020-04-27 Control method, control device, control equipment, target equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113643438A (en)

Similar Documents

Publication Publication Date Title
CN109410285B (en) Calibration method, calibration device, terminal equipment and storage medium
CN108919958B (en) Image transmission method and device, terminal equipment and storage medium
US11917126B2 (en) Systems and methods for eye tracking in virtual reality and augmented reality applications
KR102056221B1 (en) Method and apparatus For Connecting Devices Using Eye-tracking
CN108681399B (en) Equipment control method, device, control equipment and storage medium
CN109032351B (en) Fixation point function determination method, fixation point determination device and terminal equipment
CN109976535B (en) Calibration method, device, equipment and storage medium
US20190377464A1 (en) Display method and electronic device
WO2019187487A1 (en) Information processing device, information processing method, and program
US11163994B2 (en) Method and device for determining iris recognition image, terminal apparatus, and storage medium
CN105306819A (en) Gesture-based photographing control method and device
CN114690900A (en) Input identification method, equipment and storage medium in virtual scene
CN113495613B (en) Eyeball tracking calibration method and device
CN112099615B (en) Gaze information determination method, gaze information determination device, eyeball tracking device, and storage medium
CN112651270A (en) Gaze information determination method and apparatus, terminal device and display object
CN113641238A (en) Control method, control device, terminal equipment, controlled equipment and storage medium
CN111610886A (en) Method and device for adjusting brightness of touch screen and computer readable storage medium
CN113643438A (en) Control method, control device, control equipment, target equipment and storage medium
CN110338750B (en) Eyeball tracking equipment
CN110018733A (en) Determine that user triggers method, equipment and the memory devices being intended to
CN112101064A (en) Sight tracking method, device, equipment and storage medium
CN113093907A (en) Man-machine interaction method, system, equipment and storage medium
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
CN112631424A (en) Gesture priority control method and system and VR glasses thereof
CN110334579B (en) Iris recognition image determining method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination