CN110673717A - Method and apparatus for controlling output device - Google Patents

Method and apparatus for controlling output device

Info

Publication number
CN110673717A
Authority
CN
China
Prior art keywords
information
output
target
user
output information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810720021.9A
Other languages
Chinese (zh)
Inventor
包英泽
吴中勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810720021.9A priority Critical patent/CN110673717A/en
Publication of CN110673717A publication Critical patent/CN110673717A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a method and a device for controlling an output device. One embodiment of the method comprises: acquiring position information of a target user in response to detecting that the target user has taken an article; determining, based on the position information, an output device that meets a preset condition in the area where the target user is located as the target output device; and sending a control instruction for outputting target information to the target output device. In this embodiment, the target output device can output the target information after the user takes the article.

Description

Method and apparatus for controlling output device
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method and a device for controlling output equipment.
Background
The emergence of the unmanned supermarket not only makes shopping more convenient and efficient for consumers, but also greatly reduces labor costs. Currently, information related to articles is generally displayed in unmanned supermarkets by means of price tags, posters, and the like.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling an output device.
In a first aspect, an embodiment of the present application provides a method for controlling an output device, where the method includes: in response to detecting that the target user takes the article, acquiring position information of the target user; determining output equipment which meets preset conditions in the area where the target user is located as target output equipment based on the position information; and sending a control instruction for outputting the target information to the target output device.
In some embodiments, the method further comprises: acquiring article information of an article; determining output information associated with the article information in a preset first output information set, wherein the output information in the first output information set is associated with the article information; the determined output information is determined as target information.
In some embodiments, the method further comprises: acquiring user information of a target user; determining output information associated with the user information in a preset second output information set, wherein the output information in the second output information set is associated with the user information; the determined output information is determined as target information.
In some embodiments, sending control instructions to output target information to a target output device includes: and sending a control instruction for outputting preset output information to the target output equipment.
In some embodiments, obtaining the user information of the target user comprises: acquiring the age and gender of the target user. Correspondingly, determining the output information associated with the user information in the preset second output information set comprises: determining, in the preset second output information set, the output information associated with the age and gender, wherein the output information in the second output information set is associated with age and gender.
In some embodiments, determining, as the target output device, an output device meeting a preset condition in an area where the target user is located based on the location information includes: and determining the output equipment which is positioned in the area where the target user is positioned and has the minimum distance with the target user as the target output equipment based on the position information.
In some embodiments, the method further comprises: and sending a control instruction for stopping outputting the target information to the target output equipment in response to detecting that the distance between the target user and the placement position before the object is taken is greater than a preset threshold value.
In a second aspect, an embodiment of the present application provides an apparatus for controlling an output device, where the apparatus includes: a position information acquisition unit configured to acquire position information of a target user in response to detection of a take of an article by the target user; an output device determination unit configured to determine, as a target output device, an output device that meets a preset condition in an area where a target user is located, based on the position information; a first transmission unit configured to transmit a control instruction to output the target information to the target output apparatus.
In some embodiments, the apparatus further comprises: an article information acquisition unit configured to acquire article information of an article; a first output information determination unit configured to determine output information associated with the item information in a preset first output information set, wherein the output information in the first output information set is associated with the item information; a first target information determination unit configured to determine the determined output information as target information.
In some embodiments, the apparatus further comprises: a user information acquisition unit configured to acquire user information of a target user; a second output information determination unit configured to determine output information associated with the user information in a preset second output information set, wherein the output information in the second output information set is associated with the user information; a second target information determination unit configured to determine the determined output information as target information.
In some embodiments, the first sending unit is further configured to: and sending a control instruction for outputting preset output information to the target output equipment.
In some embodiments, the user information acquisition unit is further configured to acquire the age and gender of the target user, and the second output information determination unit is further configured to determine, in the preset second output information set, the output information associated with the age and gender, wherein the output information in the second output information set is associated with age and gender.
In some embodiments, the output device determination unit is further configured to: and determining the output equipment which is positioned in the area where the target user is positioned and has the minimum distance with the target user as the target output equipment based on the position information.
In some embodiments, the apparatus further comprises: a second transmitting unit configured to transmit a control instruction to stop outputting the target information to the target output device in response to detecting that a distance between the target user and a placement position before the article is taken is greater than a preset threshold.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a storage device having one or more programs stored thereon; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the method and the device for controlling the output equipment, firstly, the position information of the target user is obtained in response to the fact that the target user is detected to take the article. And then, based on the position information, determining the output equipment which is in the area where the target user is located and meets the preset conditions as the target output equipment. And finally, sending a control instruction for outputting the target information to the target output equipment. Therefore, the target output equipment can output the target information after the user takes the article.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for controlling an output device according to the present application;
FIG. 3 is a schematic diagram of one application scenario of a method for controlling an output device according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method for controlling an output device according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for controlling an output device according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing an electronic device according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which a method for controlling an output device or an apparatus for controlling an output device of embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include output devices 101, 102, 103, a network 104, and a controller 105. The network 104 is used to provide a medium for communication links between the output devices 101, 102, 103 and the controller 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The controller 105 transmits a message (control instruction) to the output devices 101, 102, 103 through the network 104.
The output devices 101, 102, 103 may be hardware or software. When they are hardware, they may be various devices that support information output, including but not limited to speakers, display screens, and the like. They may also be electronic devices such as smartphones, tablets, and laptops, on which various client applications may be installed, such as audio playing applications, picture displaying applications, and video playing applications. When the output devices 101, 102, 103 are software, they can be installed in the electronic devices listed above, implemented either as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The controller 105 may be a controller that provides various services, such as a controller that sends control instructions to the output devices 101, 102, 103. The controller may send a control instruction to the output device based on the location information of the user after detecting that the user has taken the item.
It should be noted that the method for controlling the output device provided in the embodiment of the present application is generally performed by the controller 105, and accordingly, the apparatus for controlling the output device is generally disposed in the controller 105.
The controller may be hardware or software. When the controller is hardware, it may be implemented as a distributed controller cluster composed of a plurality of controllers, or may be implemented as a single controller. When the controller is software, it may be implemented as a plurality of software or software modules (for example, to provide distributed services), or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of output devices, networks, and controllers in fig. 1 is merely illustrative. There may be any number of output devices, networks, and controllers, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for controlling an output device in accordance with the present application is shown. The method for controlling the output device comprises the following steps:
step 201, in response to detecting that the target user takes the article, acquiring the position information of the target user.
In the present embodiment, an execution subject (e.g., the controller 105 in fig. 1) of the method for controlling an output apparatus may acquire position information of a target user in response to detecting that the target user takes an article. Wherein the target user may be any user. The target user determination can be specified by a technician or obtained by screening according to certain conditions. As an example, a user entering a supermarket may be set as a target user. The location information may be information related to a location, such as coordinates, addresses, orientations, and the like.
In this embodiment, the execution subject may acquire the location information of the target user in various ways.
As an example, the location where an item is placed (e.g., a shelf) may be provided with multiple sensors. When the user takes an article, the sensor at the article's original placement position may send a signal containing its sensor identifier to the execution subject. The execution subject can look up the received sensor identifier in a preset correspondence table of sensor identifiers and sensor position information, obtaining the position information of that sensor. Since a user who takes an article is close to the sensor at the article's original placement position, the queried sensor position information can be determined as the position information of the target user.
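A minimal sketch of this lookup follows; the correspondence table, sensor identifiers, and coordinates are all hypothetical stand-ins, since the patent does not specify their form:

```python
# Hypothetical correspondence table mapping sensor identifiers to
# sensor position information; the user's position is approximated by
# the position of the sensor that fired when the item was taken.
SENSOR_POSITIONS = {
    "shelf-3-slot-7": (12.5, 4.0),  # (x, y) coordinates, illustrative
    "shelf-3-slot-8": (12.5, 4.6),
}

def locate_target_user(sensor_id):
    """Return the queried sensor position as the target user's position."""
    position = SENSOR_POSITIONS.get(sensor_id)
    if position is None:
        raise KeyError(f"unknown sensor identifier: {sensor_id}")
    return position
```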
As an example, the execution subject may receive location information transmitted from a terminal device used by the target user, and determine the received location information as the position information of the target user. Taking the scenario of a user entering an unmanned supermarket as an example, the user may use a terminal device (for example, a smartphone with a positioning application installed) to send his or her user information (for example, a face image, a user identifier, and the like) to the execution subject by scanning a two-dimensional code or similar means, and the terminal device may upload the user's position information to the execution subject while the user shops.
In this embodiment, the execution subject may detect that the target user takes the item in various ways.
As an example, the execution subject may obtain a depth image of the target user and analyze it to detect that the user has taken an item. Specifically, the acquired depth image may be converted into a skeleton point image, in which a number of points, also called joint points, represent the joints of the human body; as an example, the skeleton point image may include 20 joint points. On this basis, a technician can set a standard skeleton point image representing the posture of taking an article. The similarity between the target user's skeleton point image and the standard skeleton point image can then be calculated; when the similarity is greater than a preset threshold (e.g., 90%), the target user may be considered to have taken the article.
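The patent does not fix a particular similarity metric for comparing skeleton point images; the sketch below scores two skeletons by their average joint distance, which is only an illustrative choice:

```python
import math

def skeleton_similarity(joints_a, joints_b):
    """Similarity between two skeletons, each an equal-length list of
    (x, y) joint coordinates; 1.0 means identical. The inverse mean
    joint distance is an illustrative metric, not the patent's."""
    assert len(joints_a) == len(joints_b)
    total = sum(math.dist(a, b) for a, b in zip(joints_a, joints_b))
    return 1.0 / (1.0 + total / len(joints_a))

def is_taking_item(user_joints, standard_joints, threshold=0.9):
    """Detect 'taking an item' when similarity exceeds the preset threshold."""
    return skeleton_similarity(user_joints, standard_joints) > threshold
```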
As an example, the location where items are placed (e.g., a shelf) may be provided with multiple sensors. When a user takes an article, the sensor at the article's original placement position may send a signal to the execution subject, from which the execution subject can detect that the target user has taken the article. It should be noted that, in this case, the user who takes the item may be determined as the target user.
As an example, the execution body may first acquire an image containing an image of the user captured by a camera provided in a preset range of an article placement position (e.g., a shelf). Then, the shot image can be input into a pre-trained gesture recognition model, and recognition result information used for representing the gesture category of the user is obtained. In response to the recognition result information representing that the user gesture category is a pickup item, the execution subject may detect that the target user picks up the item. As an example, the gesture recognition model may be obtained by training through the following steps:
First, a set of training samples is obtained. Each training sample may include a sample image and annotation information for the sample image, where the annotation information represents the category of the human body posture displayed in the sample image. By way of example, the categories of body posture may be taking an item, putting an item down, and the like.
Second, an initial posture recognition model is acquired. The initial posture recognition model can be any of various existing image classification models, for example a Residual Network (ResNet) or VGG. VGG is a classification model proposed by the Visual Geometry Group (VGG) at the University of Oxford.
Third, the posture recognition model is obtained by training, taking the sample images of the training samples as input to the initial posture recognition model and the annotation information corresponding to each input sample image as its expected output. Specifically, a sample image of a training sample may be input to the initial posture recognition model to obtain recognition result information corresponding to that image; the difference between the obtained recognition result information and the annotation information is then calculated with a preset loss function. Based on this difference, the parameters of the initial posture recognition model may be adjusted, and when a preset training end condition is satisfied, training ends and the trained model is used as the posture recognition model. The training end condition includes, but is not limited to, at least one of the following: the training time exceeds a preset duration; the number of training iterations reaches a preset number; the calculated difference is less than a preset difference threshold.
Here, the parameters of the initial pose recognition model may be adjusted in various ways based on the difference between the obtained recognition result information and the labeling information corresponding to the input training sample. For example, a BP (Back Propagation) algorithm or an SGD (Stochastic Gradient Descent) algorithm may be used to adjust the parameters of the initial pose recognition model.
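The training loop described in the steps above can be sketched as follows. A one-parameter logistic model stands in for the ResNet/VGG-style posture recognition model, and the sample features are hypothetical scalar summaries of images, purely for illustration of the forward pass, loss, gradient update, and training-end condition:

```python
import math

def train_pose_classifier(samples, lr=0.5, max_epochs=200, loss_eps=1e-3):
    """Sketch of the training loop: forward pass, log-loss against the
    annotation, SGD parameter update, and a training-end condition
    (loss below a threshold or a maximum number of epochs)."""
    w, b = 0.0, 0.0
    for _ in range(max_epochs):
        total_loss = 0.0
        for x, y in samples:  # y: 1 = "taking item", 0 = other posture
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))      # forward pass
            total_loss += -(y * math.log(p + 1e-9)
                            + (1 - y) * math.log(1 - p + 1e-9))
            grad = p - y                                   # dLoss/dlogit
            w -= lr * grad * x                             # SGD update
            b -= lr * grad
        if total_loss / len(samples) < loss_eps:           # end condition
            break
    return w, b

def predict(w, b, x):
    """Classify a (hypothetical) scalar image feature."""
    return 1 if w * x + b > 0 else 0
```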
The execution subject of the training step may be the same as or different from the execution subject of the method for controlling the output device. If they are the same, the execution subject can store the network structure and parameter values of the trained gesture recognition model locally after training. If they are different, the execution subject of the training step can, after obtaining the gesture recognition model through training, send the model's network structure and parameter values to the execution subject of the method for controlling the output device.
Step 202, based on the position information, determining the output device meeting the preset condition in the area where the target user is located as the target output device.
In this embodiment, the execution subject may determine, as the target output device, an output device that meets a preset condition in the area where the target user is located, based on the location information. The area in which the target user is located may be preset or may be determined according to a certain condition. Taking a scene that a user enters an unmanned supermarket as an example, the area where the target user is located can be preset to be all areas in the supermarket. In addition, the area where the target user is located may also be a fixed-size area with the position where the target user is located as a center and a preset value as a radius.
In this embodiment, the execution subject may determine whether the output device is located within the area where the target user is located by querying the location information of the output device. As an example, the distance between the output device and the target user may be calculated from the coordinates of the output device and the coordinates of the target user. If the calculated distance is less than the radius of the area in which the target user is located, it may be determined that the output device is within the area in which the target user is located.
In the present embodiment, the preset condition may be any of various conditions. As an example, the condition may be that the output device is located at a preset position relative to the target user; as another example, it may be that the output device is fault-free. Taking the fault-free condition as an example, the execution subject may query the state of each output device to determine whether it is faulty, where the state of the output device may be entered by a technician. As an example, a fault may be represented by the number "0" and no fault by the number "1"; the execution subject may then determine an output device in state "1" in the area where the target user is located as the target output device.
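The in-area check and the fault-free condition above can be combined into a simple selection routine. The state convention ("1" = no fault) follows the example above, while the device-record field names are assumptions:

```python
import math

def select_target_devices(devices, user_pos, radius):
    """Return the ids of devices inside the target user's area
    (distance from the user below `radius`) whose state is 1 (no fault).
    Each device record is a dict with hypothetical keys id/pos/state."""
    targets = []
    for dev in devices:
        if dev["state"] != 1:                      # skip faulty devices
            continue
        if math.dist(dev["pos"], user_pos) < radius:
            targets.append(dev["id"])
    return targets
```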
Step 203, sending a control instruction for outputting the target information to the target output device.
In this embodiment, upon determining the target output device in step 202, the execution subject may send a control instruction of outputting the target information to the target output device. The target information may be information specified by a technician or information determined according to a certain condition.
In some optional implementations of this embodiment, the method for controlling an output device may further include: acquiring article information of an article; in a preset first output information set, output information associated with the article information is determined. Wherein the output information in the first output information set is associated with the item information. The determined output information is determined as target information.
In some optional implementations of this embodiment, sending a control instruction for outputting the target information to the target output device includes: and sending a control instruction for outputting preset output information to the target output equipment.
In some optional implementations of this embodiment, the method for controlling an output device may further include: and sending a control instruction for stopping outputting the target information to the target output equipment in response to detecting that the distance between the target user and the placement position before the object is taken is greater than a preset threshold value.
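This optional stop-output behavior can be sketched as follows; `send_instruction` is a hypothetical stand-in for whatever channel delivers control instructions to the target output device:

```python
import math

def maybe_stop_output(user_pos, item_origin_pos, threshold, send_instruction):
    """If the target user has moved farther than `threshold` from the
    article's placement position before it was taken, send a control
    instruction to stop outputting the target information."""
    if math.dist(user_pos, item_origin_pos) > threshold:
        send_instruction("stop_output")
        return True
    return False
```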
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for controlling an output device according to the present embodiment. In the application scenario of fig. 3, the item is placed on a shelf 302, the shelf 302 being provided with at least one pressure sensor. The execution subject of the method for controlling the output device may be a backend server 300 of a supermarket. Background server 300 is in communication connection with depth camera 301. The depth camera 301 is used for shooting an image of the shelf 302 and sending a shot depth image 303 to the background server 300. The backend server 300 may analyze the received depth image 303. Specifically, the depth image 303 may be converted into a bone point image 304. The backend server 300 may then calculate the similarity between the skeletal point image 304 and the standard skeletal point image 305 of the picked item. When the calculated similarity is greater than the preset threshold, the background server 300 detects that the user takes the article. Meanwhile, the background server 300 may receive a signal containing a sensor identifier sent by a pressure sensor on the shelf 302. The execution subject may query the sensor identifier in the preset correspondence table 306 between the sensor identifier and the sensor location information. Thereby obtaining sensor location information 307 corresponding to the sensor identification. Since the user is close to the sensor at the original placement position of the article to be picked up when picking up the article, the sensor position information 307 obtained by the inquiry can be used as the position information of the target user.
Based on the position information 307, the fault-free display screen 308 and speaker 309 in the area where the target user is located are determined as the target output devices. A control instruction for outputting the target information is sent to the display screen 308 and the speaker 309. As an example, the control instruction may direct the display screen 308 to display a picture of a beach landscape and the speaker 309 to play the sound of ocean waves.
In the method provided by the above embodiment of the present application, the position information of the target user is first obtained in response to detecting that the target user has taken an article. Then, based on the position information, an output device in the area where the target user is located that meets the preset condition is determined as the target output device. Finally, a control instruction for outputting the target information is sent to the target output device. In this way, the target output device outputs the target information when the target user takes the article.
With further reference to FIG. 4, a flow 400 of one embodiment of a method for controlling an output device is shown. The process 400 of the method for controlling an output device includes the steps of:
step 401, in response to detecting that the target user takes the item, acquiring location information of the target user.
In this embodiment, the specific implementation and technical effects of step 401 are similar to those of step 201 in the embodiment corresponding to fig. 2, and are not described herein again.
Step 402, based on the position information, determining the output device in the area where the target user is located and having the smallest distance to the target user as the target output device.
In this embodiment, the execution subject of the method for controlling an output device may determine, as the target output device, the output device whose distance from the target user is smallest within the area where the target user is located. As an example, the execution subject may first determine the output devices within that area, then obtain the distance between each of them and the target user, and finally select the output device with the smallest distance as the target output device.
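The nearest-device selection can be sketched as below; as in the earlier sketch, the device-record field names are assumptions:

```python
import math

def nearest_output_device(devices, user_pos, radius):
    """Among devices within the target user's area, return the id of
    the one with the smallest distance to the user (None if the area
    contains no device)."""
    in_area = [d for d in devices if math.dist(d["pos"], user_pos) < radius]
    if not in_area:
        return None
    return min(in_area, key=lambda d: math.dist(d["pos"], user_pos))["id"]
```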
In step 403, the age and gender of the target user are obtained.
In this embodiment, the execution subject may acquire the age and sex of the target user in various ways.
As an example, the execution subject described above may receive the age and sex of the user transmitted from the terminal device used by the target user. Specifically, taking a scene that a user enters an unmanned supermarket as an example, the user may use a terminal device (for example, a smartphone with a positioning application installed therein) to send user information (user gender, age, and the like) of the user to the execution subject by scanning a two-dimensional code and the like.
As another example, the execution subject may acquire an image of the target user and obtain the age and gender of the target user through face recognition. It should be noted that face recognition technology is now widely used in various photographing and face recognition applications. In practice, the execution subject may directly call the API (Application Programming Interface) of certain services (for example, Microsoft's Project Oxford) to recognize the image of the user and thereby obtain the user's gender, age, and the like.
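The two acquisition paths of step 403 (terminal-supplied data versus face recognition) can be sketched as follows. `recognize_face` is a hypothetical stand-in for a real cloud face-analysis API; its name and return shape are assumptions, not part of the patent.

```python
def get_user_profile(terminal_data=None, face_image=None, recognize_face=None):
    """Return (age, gender), preferring terminal-supplied data over face recognition.

    terminal_data:   dict like {"age": 30, "gender": "female"} sent by the user's device
    face_image:      an image of the target user (opaque to this sketch)
    recognize_face:  callable standing in for a face-analysis service
    """
    if terminal_data is not None:
        return terminal_data["age"], terminal_data["gender"]
    if face_image is not None and recognize_face is not None:
        result = recognize_face(face_image)  # assumed to return {"age": ..., "gender": ...}
        return result["age"], result["gender"]
    raise ValueError("no source of user information available")

# Path 1: user information sent from the terminal (e.g. after scanning a QR code).
print(get_user_profile(terminal_data={"age": 30, "gender": "female"}))  # (30, 'female')

# Path 2: fall back to a (mocked) face-recognition service.
mock_api = lambda img: {"age": 41, "gender": "male"}
print(get_user_profile(face_image="user.jpg", recognize_face=mock_api))  # (41, 'male')
```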
In step 404, output information associated with age and gender is determined in a preset second output information set.
In this embodiment, the execution subject may determine the output information associated with age and gender in a preset second output information set, where the output information in the second output information set is associated with age and gender. The second output information set may be a collection of various kinds of information, such as text, pictures, and audio. It is to be understood that "first" and "second" in the first and second output information sets serve only to distinguish the two sets and do not limit them in any way. In practice, age and gender may be regarded as a 2-tuple, and each 2-tuple may have output information associated with it. The execution subject may match the user's age and gender against the second output information set, obtain the 2-tuple with the highest similarity to the user's age and gender, and take the output information associated with that 2-tuple as the output information associated with the user's age and gender.
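The 2-tuple matching of step 404 can be sketched as below. The similarity measure used here (exact gender match, then closest age, with a fall-back to all entries) is an assumption for illustration; the patent does not specify how "highest similarity" is computed.

```python
# Hypothetical second output information set keyed by (age, gender) 2-tuples.
output_info_set = {
    (20, "female"): "Ad copy A",
    (25, "male"): "Ad copy B",
    (60, "male"): "Ad copy C",
}

def match_output_info(age, gender, info_set):
    """Return the output information whose (age, gender) key best matches the user."""
    # Prefer entries with the same gender; fall back to all entries otherwise.
    candidates = {k: v for k, v in info_set.items() if k[1] == gender}
    if not candidates:
        candidates = info_set
    # Among the candidates, pick the key with the closest age.
    best_key = min(candidates, key=lambda k: abs(k[0] - age))
    return candidates[best_key]

print(match_output_info(27, "male", output_info_set))  # Ad copy B
```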
In step 405, the determined output information is determined as target information.
In this embodiment, the execution subject may determine the output information associated with the user's age and gender obtained in step 404 as the target information.
Step 406, sending a control instruction for outputting the target information to the target output device.
In this embodiment, the execution subject may send a control instruction to the target output device obtained in step 402, so that the target output device outputs the target information.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the method for controlling an output device in the present embodiment obtains the age and gender of the user and thereby obtains output information associated with that age and gender. In this way, the output information is made relevant and targeted to the user's age and gender.
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for controlling an output device, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the apparatus 500 for controlling an output device of the present embodiment includes: a position information acquisition unit 501, an output device determination unit 502, and a first transmission unit 503. Wherein, the location information obtaining unit 501 is configured to obtain location information of the target user in response to detecting that the target user takes the item. The output device determination unit 502 is configured to determine, as a target output device, an output device that meets a preset condition within an area where a target user is located, based on the position information. The first transmission unit 503 is configured to transmit a control instruction of outputting the target information to the target output apparatus.
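The three units of apparatus 500 can be sketched as a small class wired from callables. Unit internals are placeholders, since the patent specifies only their responsibilities, not their implementation; all names here are illustrative.

```python
class OutputControlApparatus:
    """Sketch of apparatus 500: three units, each supplied as a callable."""
    def __init__(self, get_location, pick_device, send):
        self.get_location = get_location  # position information acquisition unit 501
        self.pick_device = pick_device    # output device determination unit 502
        self.send = send                  # first sending unit 503

    def on_item_taken(self, user_id, target_info):
        """Run the flow of fig. 2: locate the user, pick a device, send the instruction."""
        location = self.get_location(user_id)
        device = self.pick_device(location)
        self.send(device, target_info)

# Wire the apparatus with stub units and record what gets sent.
log = []
app = OutputControlApparatus(
    get_location=lambda uid: (1.0, 2.0),
    pick_device=lambda loc: "screen-1",
    send=lambda dev, info: log.append((dev, info)),
)
app.on_item_taken("user-42", "promo text")
print(log)  # [('screen-1', 'promo text')]
```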
In some optional implementations of this embodiment, the apparatus 500 may further include: an article information acquisition unit (not shown in the figure), a first output information determination unit (not shown in the figure), and a first target information determination unit (not shown in the figure). The article information acquisition unit is configured to acquire article information of the article. The first output information determination unit is configured to determine output information associated with the article information in a preset first output information set, where the output information in the first output information set is associated with the article information. The first target information determination unit is configured to determine the determined output information as the target information.
In some optional implementations of this embodiment, the apparatus 500 may further include: a user information acquisition unit (not shown in the figure), a second output information determination unit (not shown in the figure), and a second target information determination unit (not shown in the figure). The user information acquisition unit is configured to acquire user information of the target user. The second output information determination unit is configured to determine output information associated with the user information in a preset second output information set, where the output information in the second output information set is associated with the user information. The second target information determination unit is configured to determine the determined output information as the target information.
In some optional implementations of this embodiment, the first sending unit may be further configured to: send a control instruction for outputting preset output information to the target output device.
In some optional implementations of this embodiment, the user information acquisition unit may be further configured to: acquire the age and gender of the target user. Correspondingly, determining the output information associated with the user information in the preset second output information set, where the output information in the second output information set is associated with the user information, includes: determining output information associated with the age and gender in a preset second output information set, where the output information in the second output information set is associated with age and gender.
In some optional implementations of this embodiment, the apparatus 500 may further include: a second sending unit (not shown in the figure). The second sending unit is configured to send a control instruction to stop outputting the target information to the target output device in response to detecting that the distance between the target user and the position where the article was placed before being taken is greater than a preset threshold.
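The stop rule of this optional implementation can be sketched as below. The threshold value, the coordinate model, and `send_instruction` are illustrative assumptions; the patent only states that output stops once the user moves farther than a preset threshold from the article's original placement position.

```python
from math import dist

DISTANCE_THRESHOLD = 5.0  # illustrative value in metres; the patent leaves it preset

def maybe_stop_output(user_pos, item_origin, send_instruction):
    """Send a stop-output instruction when the user has moved beyond the threshold.

    user_pos / item_origin: (x, y) positions
    send_instruction:       callable standing in for the device control channel
    Returns True when a stop instruction was sent.
    """
    if dist(user_pos, item_origin) > DISTANCE_THRESHOLD:
        send_instruction("stop_output")
        return True
    return False

sent = []
maybe_stop_output((10.0, 0.0), (0.0, 0.0), sent.append)  # 10.0 m > threshold
print(sent)  # ['stop_output']
```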
In some optional implementations of this embodiment, the output device determination unit may be further configured to: determine, based on the position information, the output device in the area where the target user is located that has the smallest distance to the target user as the target output device.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read from it is installed into the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a position information acquisition unit, an output device determination unit, and a first transmission unit. Where the names of these units do not constitute a limitation on the units themselves in some cases, for example, the location information acquiring unit may also be described as a "unit that acquires location information of a target user in response to detecting that the target user takes an item".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to detecting that the target user takes an article, acquire position information of the target user; determine, based on the position information, an output device that meets a preset condition in the area where the target user is located as the target output device; and send a control instruction for outputting the target information to the target output device.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention disclosed herein is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (16)

1. A method for controlling an output device, comprising:
in response to detecting that a target user takes an article, acquiring position information of the target user;
determining output equipment which meets preset conditions in the area where the target user is located as target output equipment based on the position information;
and sending a control instruction for outputting the target information to the target output device.
2. The method of claim 1, wherein the method further comprises:
acquiring article information of the article;
determining output information associated with the article information in a preset first output information set, wherein the output information in the first output information set is associated with the article information;
and determining the determined output information as the target information.
3. The method of claim 1, wherein the method further comprises:
acquiring user information of the target user;
determining output information associated with the user information in a preset second output information set, wherein the output information in the second output information set is associated with the user information;
and determining the determined output information as the target information.
4. The method of claim 1, wherein the sending control instructions to the target output device to output target information comprises:
and sending a control instruction for outputting preset output information to the target output equipment.
5. The method of claim 3, wherein the obtaining user information of the target user comprises:
acquiring the age and the gender of the target user; and
determining output information associated with the user information in a preset second output information set, wherein the output information in the second output information set is associated with the user information, and the method comprises the following steps:
and determining output information associated with the age and the gender in a preset second output information set, wherein the output information in the second output information set is associated with the age and the gender.
6. The method according to any one of claims 1 to 5, wherein the determining, based on the location information, an output device meeting a preset condition in an area where the target user is located as a target output device includes:
and determining the output equipment which is in the area where the target user is located and has the minimum distance with the target user as the target output equipment based on the position information.
7. The method of claim 6, wherein the method further comprises:
sending a control instruction for stopping outputting the target information to the target output device in response to detecting that the distance between the target user and the position where the article was placed before being taken is greater than a preset threshold.
8. An apparatus for controlling an output device, comprising:
a position information acquisition unit configured to acquire position information of a target user in response to detection of a pickup of an item by the target user;
an output device determination unit configured to determine, as a target output device, an output device that meets a preset condition in an area where the target user is located, based on the position information;
a first transmission unit configured to transmit a control instruction to output target information to the target output apparatus.
9. The apparatus of claim 8, wherein the apparatus further comprises:
an item information acquisition unit configured to acquire item information of the item;
a first output information determination unit configured to determine output information associated with the item information in a preset first output information set, wherein the output information in the first output information set is associated with the item information;
a first target information determination unit configured to determine the determined output information as the target information.
10. The apparatus of claim 8, wherein the apparatus further comprises:
a user information acquisition unit configured to acquire user information of the target user;
a second output information determination unit configured to determine output information associated with the user information in a preset second output information set, wherein the output information in the second output information set is associated with the user information;
a second target information determination unit configured to determine the determined output information as the target information.
11. The apparatus of claim 8, wherein the first transmitting unit is further configured to:
and sending a control instruction for outputting preset output information to the target output equipment.
12. The apparatus of claim 10, wherein the user information acquisition unit is further configured to:
acquire the age and gender of the target user; and
determine output information associated with the age and the gender in a preset second output information set, wherein the output information in the second output information set is associated with the age and the gender.
13. The apparatus of any of claims 8-12, wherein the output device determination unit is further configured to:
and determining the output equipment which is in the area where the target user is located and has the minimum distance with the target user as the target output equipment based on the position information.
14. The apparatus of claim 13, wherein the apparatus further comprises:
a second sending unit configured to send a control instruction to stop outputting the target information to the target output device in response to detecting that the distance between the target user and the position where the article was placed before being taken is greater than a preset threshold.
15. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
16. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
CN201810720021.9A 2018-07-03 2018-07-03 Method and apparatus for controlling output device Pending CN110673717A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810720021.9A CN110673717A (en) 2018-07-03 2018-07-03 Method and apparatus for controlling output device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810720021.9A CN110673717A (en) 2018-07-03 2018-07-03 Method and apparatus for controlling output device

Publications (1)

Publication Number Publication Date
CN110673717A true CN110673717A (en) 2020-01-10

Family

ID=69065589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810720021.9A Pending CN110673717A (en) 2018-07-03 2018-07-03 Method and apparatus for controlling output device

Country Status (1)

Country Link
CN (1) CN110673717A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112561453A (en) * 2021-02-24 2021-03-26 北京每日优鲜电子商务有限公司 Device control method, device, electronic device and computer readable medium
CN113312947A (en) * 2020-02-27 2021-08-27 北京沃东天骏信息技术有限公司 Method and device for determining behavior object
CN113450476A (en) * 2020-03-27 2021-09-28 本田技研工业株式会社 Control device, apparatus, computer-readable storage medium, and control method
CN114684067A (en) * 2020-12-31 2022-07-01 博泰车联网科技(上海)股份有限公司 Trunk monitoring method and system and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150206188A1 (en) * 2014-01-17 2015-07-23 Panasonic Intellectual Property Corporation Of America Item presentation method, and information display method
US20160253735A1 (en) * 2014-12-30 2016-09-01 Shelfscreen, Llc Closed-Loop Dynamic Content Display System Utilizing Shopper Proximity and Shopper Context Generated in Response to Wireless Data Triggers
CN107172209A (en) * 2017-07-04 2017-09-15 百度在线网络技术(北京)有限公司 Information-pushing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lin Qiang et al., "Behavior Recognition and Intelligent Computing", Xidian University Press, pages 67-69 *

Similar Documents

Publication Publication Date Title
CN108830235B (en) Method and apparatus for generating information
US10762387B2 (en) Method and apparatus for processing image
CN110673717A (en) Method and apparatus for controlling output device
CN106846497B (en) Method and device for presenting three-dimensional map applied to terminal
WO2020062493A1 (en) Image processing method and apparatus
CN109255337B (en) Face key point detection method and device
CN110059623B (en) Method and apparatus for generating information
CN108229375B (en) Method and device for detecting face image
CN108235004B (en) Video playing performance test method, device and system
CN110555876B (en) Method and apparatus for determining position
CN110619807B (en) Method and device for generating global thermodynamic diagram
CN110209658B (en) Data cleaning method and device
CN110059624B (en) Method and apparatus for detecting living body
CN113033677A (en) Video classification method and device, electronic equipment and storage medium
CN111340015A (en) Positioning method and device
CN111860071A (en) Method and device for identifying an item
CN108038473B (en) Method and apparatus for outputting information
CN111767456A (en) Method and device for pushing information
CN110414625B (en) Method and device for determining similar data, electronic equipment and storage medium
CN110413869B (en) Method and device for pushing information
CN109840059B (en) Method and apparatus for displaying image
CN107942692B (en) Information display method and device
CN108446737B (en) Method and device for identifying objects
CN112307323A (en) Information pushing method and device
CN111325160A (en) Method and apparatus for generating information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200110)