CN113170078A - Animation display device and animation display method - Google Patents


Info

Publication number
CN113170078A
CN113170078A
Authority
CN
China
Prior art keywords
parameter
display
image
animation display
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880100035.9A
Other languages
Chinese (zh)
Inventor
片冈竜成
坂田礼子
相川真实
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN113170078A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The animation display device includes an image database storing image data constituting an animation display, and a control unit that performs the animation display. The control unit receives an environment parameter and a user parameter as fluidity parameters, receives a display parameter set in advance as a non-fluidity parameter, and performs the animation display by fluidly changing the display processing of the image data acquired from the image database on the basis of the fluidity parameters and the non-fluidity parameter.

Description

Animation display device and animation display method
Technical Field
The present invention relates to an animation display device and an animation display method for changing the fluidity of animation display.
Background
As a conventional technique, there is an electronic device capable of changing an image to be displayed by a simple operation (for example, see patent document 1). The electronic device according to patent document 1 includes a control unit that changes an image to be displayed on an image display unit based on a detected motion of an object.
Specifically, in patent document 1, motions of an object are stored in a storage section in correspondence with operations on a moving image. For example, correspondences such as fast-forwarding when the operator draws a clockwise circle by hand and rewinding when a counterclockwise circle is drawn are stored in the storage section.
When such a correspondence relationship is stored and the user performs, for example, an operation of drawing a clockwise circle in the air by hand, the control unit analyzes the motion and operates the projected image so as to fast-forward it. In patent document 1, when subtitles are displayed with a video, the display, non-display, or type of subtitle can also be switched according to the motion of the user.
That is, the electronic apparatus according to patent document 1 can change a screen to be displayed in accordance with the motion of the user, and the user can change a display image only by performing a previously associated operation.
Patent document 1: japanese patent No. 5254906
Disclosure of Invention
Problems to be solved by the invention
As described above, the electronic apparatus according to patent document 1 can change the display image in accordance with a motion of the user stored in advance. However, consider performing an animation display that urges a user who can visually recognize the display toward a moving direction, in a situation where the display place also contains a large, unspecified number of people other than that user. If the display image is changed only in accordance with predetermined specific motions of the user, an appropriate animation display cannot always be realized.
When performing an animation display that urges the user toward a moving direction, a more appropriate animation display can be realized by also considering parameters relating to the user's motion, which change fluidly with the situation, together with parameters other than the user's motion.
In other words, if the technique of patent document 1 is applied as-is to an animation display that urges a moving direction for a user who can visually recognize it, in a situation where a large, unspecified number of other people are present, the following problem arises. The technique of patent document 1 can only perform a predetermined, specific animation display according to the parameter relating to the user's motion, which changes fluidly with the situation, and cannot perform an animation display that takes parameters other than the user's motion into account. An animation display that makes the moving direction easier for the user to recognize is therefore desirable.
The present invention has been made to solve the above-described problems, and an object thereof is to obtain an animation display device and an animation display method that perform an animation display in which the moving direction is easily recognized by the user, in accordance with the situation at the display place, such as the presence of a large, unspecified number of people other than the user.
Means for solving the problems
The animation display device according to the present invention includes: an image database storing image data constituting an animation display; and a control unit that performs the animation display. The control unit receives, as fluidity parameters, an environment parameter that can be collected at the place where the animation display is performed and a user parameter that can be collected in correspondence with a user who visually recognizes the animation display; receives a display parameter set in advance as a non-fluidity parameter; and performs the animation display by fluidly changing the display processing of image data acquired from the image database on the basis of the fluidity parameters and the non-fluidity parameter.
Further, the animation display method according to the present invention includes the steps of: receiving, as fluidity parameters, an environment parameter that can be collected at the place where the animation display is performed and a user parameter that can be collected in correspondence with a user who visually recognizes the animation display; receiving a display parameter set in advance as a non-fluidity parameter; acquiring image data for the animation display from an image database in which image data constituting the animation display is stored, on the basis of the fluidity parameters and the non-fluidity parameter; and performing the animation display by fluidly changing the display processing of the acquired image data on the basis of the fluidity parameters and the non-fluidity parameter.
Advantageous Effects of Invention
According to the present invention, the display processing of image data is fluidly changed on the basis of a non-fluidity parameter acquired in advance as a display parameter and fluidity parameters acquired as an environment parameter and a user parameter. As a result, an animation display device and an animation display method can be obtained that perform an animation display in which the moving direction is easily recognized by the user, in accordance with the situation at the display place, such as the presence of a large, unspecified number of people other than the user.
Drawings
Fig. 1 is a functional block diagram showing the configuration of the animation display device according to embodiment 1 of the present invention.
Fig. 2 is an explanatory diagram showing a data structure of each database when the method 1 in embodiment 1 of the present invention is applied.
Fig. 3 is a flowchart showing a series of processes when the control unit according to embodiment 1 of the present invention executes animation display by applying method 1.
Fig. 4 is an explanatory diagram showing a data structure of each database when method 2 in embodiment 1 of the present invention is applied.
Fig. 5 is a flowchart showing a series of processes when the control unit according to embodiment 1 of the present invention executes animation display by applying method 2.
Fig. 6 is an explanatory diagram showing a data structure of each database when method 3 in embodiment 1 of the present invention is applied.
Fig. 7 is a flowchart showing a series of processes when the control unit according to embodiment 1 of the present invention executes animation display by applying method 3.
Fig. 8 is a diagram showing a series of image data used in specific example 1 in embodiment 1 of the present invention.
Fig. 9 is a diagram showing an example of a fluidity change of animation display based on the display parameter, the environmental parameter, and the user parameter using the method 2 with respect to the series of image data shown in fig. 8.
Fig. 10 is a diagram showing a state in which the animation display described in fig. 8 and 9 is performed.
Fig. 11 is a diagram showing a state in which the animation display described in fig. 8 and 9 is performed.
Fig. 12 is a diagram showing a state in which an animation display is performed at an event venue.
Fig. 13 is a configuration diagram showing a case where each function of the animation display device according to embodiment 1 of the present invention is realized by a processing circuit that is dedicated hardware.
Fig. 14 is a configuration diagram showing a case where each function of the animation display device according to embodiment 1 of the present invention is realized by a processing circuit including a processor and a memory.
(description of reference numerals)
10: animation display device; 11: control unit; 12: projection unit; 13: image database; 21: fixed instruction database; 22: surrounding environment database; 23: user information database.
Detailed Description
Hereinafter, an example of an embodiment of the animation display device and the animation display method according to the present invention will be described with reference to the drawings.
Embodiment 1.
Fig. 1 is a functional block diagram showing the configuration of the animation display device according to embodiment 1 of the present invention. The animation display device 10 according to embodiment 1 includes a control unit 11, a projection unit 12, and an image database 13. The control unit 11 is configured to be able to acquire the data necessary for the animation display from the fixed instruction database 21, the surrounding environment database 22, and the user information database 23.
The image database 13 stores image data constituting the animation display. A display parameter, an environment parameter, and a user parameter are input to the control unit 11 as external signals.
Here, the display parameter is a parameter whose setting can be changed by an administrator or the like of the facility performing the animation display, in order to specify the images to be used for the animation display. The display parameter corresponds to a non-fluidity parameter that does not change during the animation display. Although a configuration in which the display parameter is input as an external signal is described here, the animation display device 10 may instead be provided with an input unit (not shown) through which the facility administrator directly inputs the display parameter.
The environment parameter is a parameter that can be collected at the place where the animation display is performed. The user parameter is a parameter that can be collected in correspondence with the user who visually recognizes the animation display. The environment parameter and the user parameter correspond to fluidity parameters that change fluidly during the animation display.
Based on the fluidity parameters and the non-fluidity parameter received as external signals, the control unit 11 extracts, from the fixed instruction database 21, the surrounding environment database 22, and the user information database 23, the various parameters used to fluidly change the animation display, and acquires image data from the image database 13. The control unit 11 then performs the animation display by fluidly changing the display processing of the acquired image data on the basis of the extracted parameters.
In the configuration shown in fig. 1, the control unit 11 performs the animation display via the projection unit 12. However, the control unit 11 may itself include the function of the projection unit 12 and perform the animation display directly. Fig. 1 also shows the animation display device 10 as including both the control unit 11 and the projection unit 12; alternatively, the animation display device 10 may include only the control unit 11, with the control unit 11 transmitting information on the moving image to be projected to a separate projection device that includes the projection unit 12.
Specific methods by which the control unit 11 performs the animation display by fluidly changing the display processing of image data, based on the basic configuration of fig. 1, are exemplified below, and the specific contents of each method are described individually.
[Method 1] Method 1 has the following characteristics.
The series of image data is displayed as-is, without modification.
The degree of change of the animation is determined based on a combination of the parameters acquired, according to the external signals, from the fixed instruction database 21, the surrounding environment database 22, and the user information database 23.
The projection image data based on the series of image data is changed fluidly according to the obtained degree of change, thereby performing the animation display.
The following contents can be given as examples of display contents for performing animation display.
Display content 1: gate on road display for airport
Display content 2: guidance display for activities in an event venue
Display content 3: guided display of activities in theme parks
In the following example, an example of guiding a user to an empty door in an airport lobby is described.
Fig. 2 is an explanatory diagram showing the data structure of each database when method 1 in embodiment 1 of the present invention is applied. The following describes the data structures of the image database 13, the fixed instruction database 21, the surrounding environment database 22, and the user information database 23, with reference to fig. 2.
< data Structure of image database 13 >
The image data constituting the animation display are stored in the image database 13 as series of image data associated with display parameters. Fig. 2 illustrates a case where a series of image data 1, consisting of the 3 images IMG(1), IMG(2), and IMG(3), is stored in the image database 13.
< data Structure of fixed instruction database 21 >
The fixed instruction data stored in the fixed instruction database 21 associates a display parameter, a projection size, a projection standard speed, and a first parameter with one another. In fig. 2, the display parameter includes information on the selected series of image data and information indicating the priority of the guidance given by the animation display.
The projection size is the outer size of an image to be displayed as a moving image, and in fig. 2, a value of 1m × 1m is set as a standard projection size. The projection standard speed is a projection speed for moving image display, and in fig. 2, 10cm/s is set as the standard projection speed. The projection size and the projection standard speed determine the display mode of the animation.
The first parameter is obtained by digitizing the information on the priority included in the display parameter, and is set as follows in the example of fig. 2.
In the case of low priority, the first parameter is 1
In the case of a medium priority, the first parameter is 2
In the case of a high priority, the first parameter is 3
< data Structure of surrounding environment database 22 >
The surrounding environment data stored in the surrounding environment database 22 associates an environment parameter and a second parameter with each other. In fig. 2, the environment parameter includes information on the degree of congestion at the target gate in the airport lobby. The second parameter is obtained by digitizing the congestion information included in the environment parameter, and is set as follows in the example of fig. 2.
When the congestion degree is high, the second parameter is 1
When the congestion degree is medium, the second parameter is 2
When the congestion degree is low, the second parameter is 3
The second parameter decreases as the congestion at the target gate increases because the more congested the gate, the less necessary it is to guide the user to it.
In addition to data on the degree of congestion at a predetermined place in the system's use environment as described above, the environment parameter may include at least one of data relating to the weather, the illuminance, the material of the projection surface, the color temperature of the projection surface, and the like at the place where the animation is displayed.
< data Structure of user information database 23 >
The user information data stored in the user information database 23 is configured by associating respective components including a user parameter and a third parameter with each other. In fig. 2, the user parameter includes information on the moving speed of the user who visually recognizes the moving image display. The third parameter is obtained by digitizing information on the moving speed of the user, and is set as follows in the example of fig. 2.
When the moving speed is 0.0< X ≦ 5.0, the third parameter is 1
When the moving speed is 5.0< X ≦ 10.0, the third parameter is 2
In the case of a movement speed of 10.0< X, the third parameter is 3
The third parameter increases with the user's moving speed because a faster-moving person is less likely to notice the projected image, so the animation must be emphasized to be more noticeable.
In addition to data on the moving speed of the user who visually recognizes the animation display, the user parameter may include at least one of data relating to the user's physical characteristics and data relating to nationality, destination, purpose, and the like. The physical characteristics of the user include the user's age and sex and whether a wheelchair or a bicycle is used.
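Taken together, the three digitizations above can be sketched as simple lookup functions. This is a hypothetical Python sketch: the function names and the priority/congestion labels are illustrative, while the mapping values and speed thresholds follow the example of fig. 2.

```python
def first_parameter(priority: str) -> int:
    """Digitize the guidance priority from the display parameter (fig. 2)."""
    return {"low": 1, "medium": 2, "high": 3}[priority]

def second_parameter(congestion: str) -> int:
    """Digitize the gate congestion from the environment parameter.

    Higher congestion gives a lower value: a crowded gate needs
    less guidance toward it.
    """
    return {"high": 1, "medium": 2, "low": 3}[congestion]

def third_parameter(speed: float) -> int:
    """Digitize the user's moving speed (thresholds as in fig. 2).

    Faster users are less likely to notice the projection, so the
    parameter, and hence the emphasis of the animation, increases.
    """
    if speed <= 5.0:
        return 1
    if speed <= 10.0:
        return 2
    return 3
```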
Here, the display parameters, the environment parameters, and the user parameters described above will be described in addition.
The display parameter can be set and changed by the administrator of the facility performing the animation display, according to the current state of the facility. For example, when an event is held at an airport, a special animation display corresponding to the event can be set, increasing the standard projection speed or reducing the projection size. By storing a series of image data corresponding to the desired display parameter in the image database 13 and creating fixed instruction data corresponding to it, an animation display matching the changed setting can be realized.
The environment parameter is collected by, for example, a device such as a monitoring camera that can quantitatively measure the degree of congestion of the surrounding environment and of the guidance destination of the animation display.
The user parameter is collected by a device such as a speed sensor that can quantitatively measure the moving speed of the user at the place where the animation is displayed. Alternatively, the user may carry a medium such as an ID card storing information about the user, and the user parameter may be read from the medium by an information acquisition device such as a card reader. The user parameter may also be extracted from information entered by staff at the airport.
The values of the fixed instruction data in the fixed instruction database 21, the ambient environment data in the ambient environment database 22, and the user information data in the user information database 23 can be changed according to the situation.
Next, in method 1, a specific example will be described in which the projection conditions are changed by using the first parameter, the second parameter, and the third parameter specified from the external signal, and the animation is displayed.
When the first parameter is 1, the second parameter is 1, and the third parameter is 1, the control unit 11 can determine the projection speed for performing the animation display as follows.
projection speed = first parameter × second parameter × third parameter × projection standard speed
= 1 × 1 × 1 × 10 cm/s
= 10 cm/s
When the first parameter is 3, the second parameter is 3, and the third parameter is 3, the control unit 11 can determine the projection speed for performing the animation display as follows.
projection speed = first parameter × second parameter × third parameter × projection standard speed
= 3 × 3 × 3 × 10 cm/s
= 270 cm/s
The control unit 11 may also determine the projection speed using weighting coefficients preset for the first, second, and third parameters. Each weighting coefficient can be set to a value between 0 and 1, for example; the specific calculations above correspond to the case where all weighting coefficients are 1.
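The weighted projection-speed calculation for method 1 might look as follows. This is a sketch: `projection_speed` and the per-parameter weights `w1` to `w3` are hypothetical names, with all weights at 1 reproducing the worked examples above.

```python
def projection_speed(p1: int, p2: int, p3: int, standard_speed: float,
                     w1: float = 1.0, w2: float = 1.0, w3: float = 1.0) -> float:
    """Method 1: the projection speed is the product of the three
    parameters, each scaled by a preset weight in [0, 1], and the
    projection standard speed (cm/s)."""
    return (w1 * p1) * (w2 * p2) * (w3 * p3) * standard_speed
```

With all parameters at 1 and a 10 cm/s standard speed this yields 10 cm/s, and with all parameters at 3 it yields 270 cm/s, matching the calculations above.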
The projection condition changed based on the various parameters is not limited to the projection standard speed. For example, the control unit 11 may fluidly change the animation display by changing the projection size of the projection image data based on the various parameters.
Alternatively, a series of image data to be displayed may be stored in the image database 13 individually for each display parameter. In this case, the control unit 11 does not use the first parameter to calculate the projection speed; instead, it acquires the series of image data corresponding to the first parameter from the image database 13 as the projection image data, calculates the projection speed using the second and third parameters, and performs the animation display with the image data corresponding to the first parameter.
As described above, according to method 1, when a moving image is projected, the projection image data can be fluidly changed based on parameters determined from different viewpoints: the first parameter, a non-fluidity parameter set in advance, and the second and third parameters, fluidity parameters that change fluidly with the situation at projection time. In other words, method 1 can reflect the intention of the facility administrator, the state of the surrounding environment, and the user's personal information or situation in the projected image in real time.
Fig. 3 is a flowchart showing a series of processes when the control unit 11 according to embodiment 1 of the present invention executes animation display by applying the method 1. In step S301, the control unit 11 receives an external signal including a display parameter, an environment parameter, and a user parameter.
Next, in step S302, the control unit 11 acquires the projection size, the projection standard speed, and the first parameter corresponding to the display parameter from the fixed instruction database 21.
Next, in step S303, the control unit 11 acquires a second parameter corresponding to the environmental parameter from the ambient environment database 22. Next, in step S304, the control unit 11 acquires a series of image data from the image database 13 based on the setting of the display parameters.
Next, in step S305, the control unit 11 acquires a third parameter corresponding to the user parameter from the user information database 23. Next, in step S306, the control unit 11 changes the projection conditions of the series of image data based on the first to third parameters. Then, in step S307, the control unit 11 projects a series of image data according to the changed projection conditions, thereby executing animation display.
By executing such a series of processing, the control unit 11 can select an appropriate series of image data based on the first to third parameters and perform animation display under appropriate projection conditions.
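The flow of steps S301 to S307 can be sketched as follows. This is hypothetical: the patent does not specify interfaces, so the four databases are stood in for by dictionaries and the projection by a `projector` callback.

```python
def run_animation_display(display_p, env_p, user_p,
                          fixed_db, env_db, user_db, image_db, projector):
    """Sketch of the method 1 flow (fig. 3, steps S302-S307); the
    external signal of S301 arrives as the three parameter arguments."""
    # S302: projection size, standard speed and first parameter
    size, std_speed, p1 = fixed_db[display_p]
    # S303: second parameter from the surrounding environment database
    p2 = env_db[env_p]
    # S304: series of image data selected by the display parameter
    images = image_db[display_p]
    # S305: third parameter from the user information database
    p3 = user_db[user_p]
    # S306: change the projection conditions (method 1 speed rule)
    speed = p1 * p2 * p3 * std_speed
    # S307: project the series under the changed conditions
    projector(images, size, speed)
```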
[Method 2] Method 2 has the following characteristics.
The projection image is divided into a plurality of parts, which are stored separately.
The image shape and the degree of animation change of each part are determined individually, based on the combination of parameters acquired from the fixed instruction database 21, the surrounding environment database 22, and the user information database 23.
The projection image data is generated by combining the parts, and the animation display is performed by fluidly changing the projection image data according to the degree of change determined individually for each part.
Fig. 4 is an explanatory diagram showing the data structure of each database when method 2 in embodiment 1 of the present invention is applied. The following describes the data structures of the image database 13, the fixed instruction database 21, the surrounding environment database 22, and the user information database 23, with reference to fig. 4.
< data Structure of image database 13 >
The partial data and the projection image data that is the overall image obtained by combining the partial data are stored in the image database 13 in association with the display parameters.
The 4 images IMG1 and IMG1(1) to IMG1(3) shown in fig. 4 are as follows.
IMG1: overall image of projection image 1
IMG1(1): first partial image constituting projection image 1
IMG1(2): second partial image constituting projection image 1
IMG1(3): third partial image constituting projection image 1
That is, fig. 4 illustrates a case where the projection image 1 is composed of the 3 partial images IMG1(1) to IMG1(3), which are merged and stored in the image database 13 as the overall image IMG1. In the following description, the animation display device 10 changes the degree of change of the partial image IMG1(1) according to the fixed instruction data, that of IMG1(2) according to the surrounding environment data, and that of IMG1(3) according to the user information data.
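The relationship between the overall image, its parts, and the data source that drives each part can be represented as a small data structure. This is a hypothetical sketch of the fig. 4 layout; the dictionary keys and comments are illustrative.

```python
# Projection image 1 (fig. 4): three partial images merged into one
# overall image, each driven by a different data source.
projection_image_1 = {
    "overall": "IMG1",
    "parts": {
        "IMG1(1)": "fixed instruction data",        # facility administrator's intent
        "IMG1(2)": "surrounding environment data",  # e.g. gate congestion
        "IMG1(3)": "user information data",         # e.g. user's moving speed
    },
}
```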
< data Structure of fixed instruction database 21 >
The fixed instruction data stored in the fixed instruction database 21 associates a display parameter, a projection size, a projection standard speed, and a first parameter with one another. In fig. 4, the display parameter includes information on the selected projection image and information indicating the priority of the guidance given by the animation display.
The projection size is, for example, the outer size of the partial image IMG1(1), and in fig. 4, a value of 1m × 1m is set as a standard projection size. The projection standard speed is a projection speed for moving image display of the partial image IMG1(1), and in fig. 4, 10cm/s is set as a standard projection speed. That is, in the example of fig. 4, the projection size and the projection standard velocity determined as the fixed instruction data are used as the first display mode for determining the display mode of the first partial image IMG1 (1).
The first parameter is obtained by digitizing the information on the priority included in the display parameter, and is set as follows in the example of fig. 4.
In the case of low priority, the first parameter is 1
In the case of a medium priority, the first parameter is 2
In the case of a high priority, the first parameter is 3
< data Structure of surrounding environment database 22 >
The surrounding environment data stored in the surrounding environment database 22 associates an environment parameter, a projection size, a projection standard speed, and a second parameter with one another. In fig. 4, the environment parameter includes information on the degree of congestion at the target gate in the airport lobby.
The projection size is, for example, the outer size of the partial image IMG1(2), and in fig. 4, a value of 1m × 1m is set as a standard projection size. The projection standard speed is a projection speed for animation display of the partial image IMG1(2), and in fig. 4, 12cm/s is set as a standard projection speed. That is, in the example of fig. 4, the projection size and the projection standard velocity determined as the ambient environment data are used as the second display mode for determining the display mode of the second partial image IMG1 (2).
The second parameter is obtained by digitizing the information on the degree of congestion included in the environment parameter, and is set as follows in the example of fig. 4.
When the congestion degree is high, the second parameter is 1
When the congestion degree is medium, the second parameter is 2
When the congestion degree is low, the second parameter is 3
< data Structure of user information database 23 >
The user information data stored in the user information database 23 is configured by associating respective components including a user parameter, a projection size, a projection standard speed, and a third parameter with each other. In fig. 4, the user parameter includes information on the moving speed of the user who visually recognizes the moving image display. The third parameter is obtained by digitizing information on the moving speed of the user, and is set as follows in the example of fig. 4.
When the moving speed is 0.0< X ≦ 5.0, the third parameter is 1
When the moving speed is 5.0< X ≦ 10.0, the third parameter is 2
In the case of a movement speed of 10.0< X, the third parameter is 3
The projection size is, for example, the outer size of the partial image IMG1(3), and in fig. 4, a value of 1m × 1m is set as a standard projection size. The projection standard speed is a projection speed for animation display of the partial image IMG1(3), and in fig. 4, 20cm/s is set as a standard projection speed. That is, in the example of fig. 4, the projection size and the projection standard velocity determined as the user information data are used as the third display mode for deciding the display mode of the third partial image IMG1 (3).
Next, in method 2, a specific example of performing animation display using the first parameter, the second parameter, and the third parameter determined from the external signal will be described.
When the first parameter is 1, the second parameter is 1, and the third parameter is 1, the control unit 11 can determine the projection speed for performing the animation display as follows.
Projection speed of the first partial image IMG1(1)
= first parameter × projection standard speed specified in the fixed instruction data
= 1 × 10 cm/s
= 10 cm/s
Projection speed of the second partial image IMG1(2)
= second parameter × projection standard speed specified in the ambient environment data
= 1 × 12 cm/s
= 12 cm/s
Projection speed of the third partial image IMG1(3)
= third parameter × projection standard speed specified in the user information data
= 1 × 20 cm/s
= 20 cm/s
When the first parameter is 3, the second parameter is 3, and the third parameter is 3, the control unit 11 can determine the projection speed for performing the animation display as follows.
Projection speed of the first partial image IMG1(1)
= first parameter × projection standard speed specified in the fixed instruction data
= 3 × 10 cm/s
= 30 cm/s
Projection speed of the second partial image IMG1(2)
= second parameter × projection standard speed specified in the ambient environment data
= 3 × 12 cm/s
= 36 cm/s
Projection speed of the third partial image IMG1(3)
= third parameter × projection standard speed specified in the user information data
= 3 × 20 cm/s
= 60 cm/s
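The rule in the two worked examples is a simple multiplication of each parameter by the projection standard speed registered in the corresponding database. The following is a minimal sketch of that rule; the function and table names are assumptions, not part of the patent, and the standard speeds follow the fig. 4 example.

```python
# Per-image speed rule of method 2 (hypothetical sketch):
# projection speed = parameter value * projection standard speed.
# Standard speeds follow the fig. 4 example (10, 12, 20 cm/s).
STANDARD_SPEED_CM_S = {
    "IMG1(1)": 10,  # fixed instruction data
    "IMG1(2)": 12,  # ambient environment data
    "IMG1(3)": 20,  # user information data
}

def projection_speeds(first, second, third):
    """Return the projection speed (cm/s) of each partial image."""
    params = {"IMG1(1)": first, "IMG1(2)": second, "IMG1(3)": third}
    return {img: p * STANDARD_SPEED_CM_S[img] for img, p in params.items()}
```

With all three parameters set to 1 this reproduces 10, 12, and 20 cm/s; with all set to 3 it reproduces 30, 36, and 60 cm/s, matching the two examples above.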
As described above, the control unit 11 can individually determine the projection speeds of the respective partial images IMG1(1) to IMG1(3) using the first to third parameters. The control unit 11 can also combine the partial images IMG1(1) to IMG1(3), whose projection speeds are determined according to the respective degrees of change, and display them as one animation.
In other words, a partial image whose degree of change should follow the display parameter, a partial image whose degree of change should follow the environment parameter, and a partial image whose degree of change should follow the user parameter can be registered individually. As a result, display control of the animation can be performed more finely based on the fluidity parameter and the non-fluidity parameter.
In the above example, the case where the projection speed is changed in accordance with each parameter has been described, but the present invention is not limited to such a control method. The control unit 11 can also perform display control of changing the projection size according to each parameter or changing the projection speed and the projection size together.
In addition, when the method 2 is adopted, the control unit 11 can determine each setting of the animation display in consideration of the weight coefficients preset for the first parameter, the second parameter, and the third parameter, as in the method 1.
In the above example, the case where the degree of change is determined for each of the partial images IMG1(1) to IMG1(3) based on one parameter has been described, but the present invention is not limited to such a control method. The control unit 11 may determine the degree of change of one partial image based on 2 or more parameters.
For example, when 2 parameters such as the degree of congestion and the air temperature exist as the environmental parameters, the projection size or the projection speed of the second partial image IMG1(2) may be determined based on a combination of the 2 parameters. Further, the projection size or the projection speed of the first partial image IMG1(1) may be determined based on a combination of the first parameter and the second parameter. That is, if the determined projection size or projection speed is in an appropriate range, the degree of change may be determined according to a combination of any parameters.
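One way to honor the "appropriate range" condition above is to combine the parameters and clamp the resulting speed. The combination rule (averaging) and the bounds below are assumptions chosen for illustration only; the patent leaves both open.

```python
# Hypothetical sketch: one degree of change derived from a combination of
# parameters, with the resulting speed clamped to an appropriate range.
def combined_speed(params, standard_speed_cm_s, lo=5.0, hi=60.0):
    """Combine parameters by averaging (assumption) and clamp the speed."""
    degree = sum(params) / len(params)
    return min(max(degree * standard_speed_cm_s, lo), hi)
```

For example, combining a degree-of-congestion parameter of 1 and an air-temperature parameter of 3 for IMG1(2) gives an average degree of 2 and a speed of 24 cm/s.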
Fig. 5 is a flowchart showing a series of processes when the control unit 11 according to embodiment 1 of the present invention executes animation display by applying the method 2. In step S501, the control unit 11 receives an external signal including a display parameter, an environment parameter, and a user parameter.
Next, in step S502, the control unit 11 acquires the whole image and the plurality of partial images of the projection image corresponding to the display parameter from the image database 13. Here, to correspond with fig. 4, the following description assumes that the whole image IMG1 and the 3 partial images IMG1(1) to IMG1(3) are acquired.
Next, in step S503, the control unit 11 acquires the projection size, the projection standard speed, and the first parameter corresponding to the display parameter from the fixed instruction database 21. Next, in step S504, the control unit 11 changes the projection conditions of the first partial image IMG1(1) in accordance with the projection size, the projection standard speed, and the first parameter acquired from the fixed instruction database 21.
Next, in step S505, the control unit 11 acquires the projection size, the projection standard speed, and the second parameter corresponding to the environmental parameter from the surrounding environment database 22. Next, in step S506, the control unit 11 changes the projection conditions of the second partial image IMG1(2) in accordance with the projection size, the projection standard speed, and the second parameter acquired from the ambient environment database 22.
Next, in step S507, the control unit 11 acquires the projection size, the projection standard speed, and the third parameter corresponding to the user parameter from the user information database 23. Next, in step S508, the control unit 11 changes the projection conditions of the third partial image IMG1(3) in accordance with the projection size, the projection standard speed, and the third parameter acquired from the user information database 23.
Next, in step S509, the control unit 11 merges the images of the respective portions and projects them in accordance with the changed projection conditions, thereby executing animation display.
By executing such a series of processing, the control unit 11 can select appropriate partial images based on the first to third parameters and display the partial images as animation under appropriate projection conditions.
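The flow of steps S501 to S509 can be condensed into a short sketch. The database shapes and names below are assumptions chosen for illustration; the patent specifies only the lookups and the order of the steps.

```python
# Hypothetical sketch of the method-2 flow (steps S501-S509 of fig. 5).
def animate_method2(signal, image_db, fixed_db, ambient_db, user_db):
    # S501: the external signal carries the three kinds of parameters.
    display_p, env_p, user_p = signal["display"], signal["env"], signal["user"]
    # S502: whole image and its partial images for the display parameter.
    whole, parts = image_db[display_p]
    conditions = []
    # S503-S508: look up each database and change the projection conditions
    # (size, degree of change * standard speed) of each partial image.
    for part, db, key in zip(parts, (fixed_db, ambient_db, user_db),
                             (display_p, env_p, user_p)):
        size, std_speed, degree = db[key]
        conditions.append((part, size, degree * std_speed))
    # S509: the merged projection itself is hardware-dependent and out of
    # scope here, so the projection plan is returned instead.
    return whole, conditions
```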
[Method 3] Method 3 has the following characteristics.
A series of image data to be displayed is divided into a base image and guide images and stored.
The animation display is performed by changing the guide images in a fluid manner according to the degree of change.
Fig. 6 is an explanatory diagram showing a data structure of each database when method 3 in embodiment 1 of the present invention is applied. The following describes the data structure relating to each of the image database 13, the fixed instruction database 21, the ambient environment database 22, and the user information database 23, with reference to fig. 6.
< data Structure of image database 13 >
In the method 3, as shown in fig. 6, a series of image data to be displayed is classified into two types, a base image and a guide image, and stored in the image database 13.
Here, the basic image corresponds to an image that is displayed in a fixed manner in the animation display. The guide image corresponds to a plurality of images that are sequentially switched and displayed for animation display. In the method 3, animation display is performed by a combination of the base image and the plurality of guide images.
< data Structure of fixed instruction database 21 >
The fixed instruction data stored in the fixed instruction database 21 is configured by associating each component including the display parameter and the first parameter with each other. In fig. 6, the display parameters include information of the selected basic image.
The first parameter is obtained by digitizing the information on the base image included in the display parameter, and is set as follows in the example of fig. 6.
When the basic image 1 is set, the first parameter is 1
When the basic image 2 is set, the first parameter is 2
When the basic image 3 is set, the first parameter is 3
As described above, the display parameters are used for selecting the basic image by the administrator of the facility performing the animation display. The administrator can specify the basic image in advance by setting the display parameters. Further, the manager can easily change the setting of the animation display contents by selectively switching the desired basic image according to the current day's event, the current day's expected status of attendees, and the like.
< data Structure of ambient Environment database 22 >
The ambient environment data stored in the ambient environment database 22 is configured by associating respective components including an environmental parameter, a first parameter, and a second parameter with each other. In fig. 6, the environment parameter is configured to include information on the degree of congestion at the target gate in the airport lobby.
In addition, the first parameter is the value determined from the fixed instruction data. The second parameter is a number identifying the guide image specified by the combination of the congestion information included in the environment parameter and the determined first parameter, and is set as follows in the example of fig. 6.
When the congestion degree is high and the first parameter is 1, the second parameter is 1
When the congestion degree is medium and the first parameter is 1, the second parameter is 2
When the congestion degree is low and the first parameter is 1, the second parameter is 3
Thus, the guide image can be selected according to the combination of the environmental parameter and the display parameter.
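The lookup above is a table keyed by the pair of the degree of congestion and the first parameter. A minimal sketch, with the table values taken from the fig. 6 example and the names chosen here as assumptions:

```python
# Hypothetical sketch of the method-3 ambient environment lookup (fig. 6):
# the second parameter (the guide image to use) is keyed by the congestion
# degree together with the first parameter.
GUIDE_IMAGE_TABLE = {
    ("high", 1): 1,
    ("medium", 1): 2,
    ("low", 1): 3,
}

def select_guide_image(congestion, first_param):
    """Return the second parameter, i.e. which guide image to use."""
    return GUIDE_IMAGE_TABLE[(congestion, first_param)]
```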
< data Structure of user information database 23 >
The user information data stored in the user information database 23 is configured by associating respective components including a user parameter and a third parameter with each other. In fig. 6, the user parameter includes information on the moving speed of the user who visually recognizes the animation display. The third parameter is a projection speed chosen according to the information on the moving speed of the user, and is set as follows in the example of fig. 6.
When the moving speed is 0.0< X ≦ 5.0, the third parameter is 10cm/s
In the case of a moving speed of 5.0< X.ltoreq.10.0, the third parameter is 20cm/s
In the case of a movement speed of 10.0< X, the third parameter is 30cm/s
In this way, in the method 3, the user information data in which the third parameter is associated with the projection speed of the projection image is stored in advance in the user information database 23. Thus, the projection speed at which the guide images are sequentially displayed can be selected in accordance with the third parameter.
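The user information lookup of method 3 thus maps the moving speed directly to a projection speed. A sketch using the thresholds and speeds of the fig. 6 example (the function name is an assumption, and the units of X are not stated in the source):

```python
# Hypothetical sketch of the method-3 user information lookup: the third
# parameter is itself the projection speed, chosen by the user's moving
# speed X. Thresholds and speeds follow the fig. 6 example.
def third_parameter(moving_speed):
    """Map the user's moving speed X to a projection speed in cm/s."""
    if 0.0 < moving_speed <= 5.0:
        return 10
    if 5.0 < moving_speed <= 10.0:
        return 20
    return 30  # 10.0 < X
```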
Fig. 7 is a flowchart showing a series of processes when the control unit 11 according to embodiment 1 of the present invention executes animation display by applying the method 3. In step S701, the control unit 11 receives an external signal including a display parameter, an environment parameter, and a user parameter.
Next, in step S702, the control unit 11 acquires a first parameter corresponding to a display parameter for specifying the base image from the fixed instruction database 21. Next, in step S703, the control unit 11 acquires a second parameter corresponding to the environmental parameter from the ambient environment database 22. Next, in step S704, the control unit 11 acquires the base image from the image database 13 based on the setting of the first parameter, and acquires the guide image from the image database 13 based on the setting of the first parameter and the second parameter.
Next, in step S705, the control unit 11 acquires a third parameter corresponding to the user parameter from the user information database 23. Next, in step S706, the control unit 11 changes the projection condition of the guide image in accordance with the third parameter. Then, in step S707, the control unit 11 projects the base image and projects the guide image in accordance with the changed projection conditions, thereby executing animation display.
By executing such a series of processing, the control unit 11 can select an appropriate base image and an appropriate guide image based on the first to third parameters, and perform animation display under appropriate projection conditions.
[Method 4] Method 4 has the following characteristics.
The projection image data is changed in a fluid manner based on the result of guiding the user.
In method 4, animation display is performed by any of methods 1 to 3 described above, the success or failure of the guidance is fed back, and the feedback result is used to change the next animation display.
Specifically, in the method 4, the control unit 11 acquires the movement result information of the user as the execution result of the animation display, and performs the animation display by further flexibly changing the projection image data in accordance with the movement result information. For example, in the method 1, the control unit 11 receives, as an external signal, movement result information indicating a result of performing animation display with the projection speed determined to be 10 cm/s.
When movement result information indicating that the guidance of the user has succeeded is obtained as the external signal, the control unit 11 maintains the current projection speed. On the other hand, when movement result information indicating that the guidance of the user has failed is obtained as the external signal, the control unit 11 changes the projection conditions using the projection image data in a more fluid manner, for example by doubling the projection speed to more strongly prompt the user, or by increasing the projection size.
In addition, when determining whether or not the guidance of the user has succeeded, basing the determination on the guidance result of a single user may change the animation display excessively. It is therefore conceivable that the control unit 11 changes the projection conditions, and thus the projection image data, in a fluid manner based on the movement result information of a plurality of users or the movement result information within a fixed time period. For example, the projection speed may be doubled when the guidance of 10 consecutive users fails, or the projection size may be doubled when the guidance of 5 or more users fails within 1 hour.
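The two example rules above (double the projection speed after 10 consecutive failures; double the projection size after 5 or more failures within 1 hour) can be sketched as follows. The class, its attribute names, and the use of timestamps in seconds are assumptions for illustration only.

```python
# Hypothetical sketch of the method-4 feedback rules described above.
class GuidanceFeedback:
    def __init__(self, speed_cm_s, size_m):
        self.speed = speed_cm_s
        self.size = size_m
        self.consecutive_failures = 0
        self.failure_times = []  # timestamps (seconds) of recent failures

    def report(self, success, now):
        """Record one guidance result; return the (speed, size) to use next."""
        if success:
            self.consecutive_failures = 0  # keep the current conditions
            return self.speed, self.size
        self.consecutive_failures += 1
        if self.consecutive_failures >= 10:
            self.speed *= 2  # 10 consecutive failures: double the speed
            self.consecutive_failures = 0
        # Keep only the failures of the last hour, then apply the size rule.
        self.failure_times = [t for t in self.failure_times if now - t < 3600]
        self.failure_times.append(now)
        if len(self.failure_times) >= 5:
            self.size *= 2  # 5+ failures within 1 hour: double the size
            self.failure_times.clear()
        return self.speed, self.size
```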
Next, a specific example of the animation display in which the fluidity changes will be described.
Example 1 example of guiding a user to a desired gate at an airport
Fig. 8 is a diagram showing a series of image data used in specific example 1 in embodiment 1 of the present invention. The 10 partial images IMG(1) to IMG(10) are displayed in order, thereby performing animation display.
The partial images IMG(1) to IMG(3) shown in the upper part are 3 pieces of image data at the start of the animation. The partial images IMG(4) to IMG(7) shown in the middle are 4 pieces of image data during execution of the animation. The partial images IMG(8) to IMG(10) shown in the lower part are 3 pieces of image data at the end of the animation.
Fig. 9 is a diagram showing an example of fluid changes to the animation display based on the display parameter, the environmental parameter, and the user parameter, using method 2 with the series of image data shown in fig. 8. Consider a situation in which the user is to be guided to gate 4 at an airport because gate 4 is vacant.
The partial images IMG(1) to IMG(10) for performing the animation display shown in fig. 9 are roughly classified into a group A consisting of A-1 to A-5, a group B consisting of B-1, a group C consisting of C-1, and a group D consisting of D-1, and a fluid change is given to each of these partial images.
The group A consists of partial images whose display processing is changed based on the degree of change determined from the environmental parameter in the ambient environment data. In the example of fig. 9, the display of the character portion "GATE 4" and the graphic portion constituted by an arrow curving to the right is repeated in the 5 patterns A-1 to A-5, thereby performing animation display. Then, as the animation display urging movement toward gate 4, the control unit 11 can determine the display speed of each of A-1 to A-5 as follows based on the second parameter corresponding to the environmental parameter.
A-1: set to 10m/s, and display slowly
A-2: set to 50m/s, quickly display
A-3: shows 1s
A-4: set to 10m/s, and display slowly
A-5: set to 50m/s, quickly display
Next, the groups B and C are partial images whose display processing is changed based on the degree of change determined from the user parameter in the user information data. In the example of fig. 9, the display of the graphic portion constituted by a plurality of right-pointing arrows is repeated in the B-1 pattern, and the display of the text portion "FAST" is repeated in the C-1 pattern, thereby performing animation display. For example, when the degree of change is given by using the user's moving speed as the user parameter, with "a" denoting a slow moving speed and "b" denoting a fast moving speed, the control unit 11 can cause each of B-1 and C-1 to be displayed under the following conditions.
B-1: in case of a, still display
In the case of b, the display is set to 50m/s, and the display is speeded up
C-1: in case of a, there is no display
In the case of b, the zoom speed is set to 25%/s, and the zoom-in/zoom-out is repeated to display
Next, the D group is a partial image determined based on the display parameters in the fixed instruction data. For example, when the display parameters are given by assuming that a is a case where the animation display is to be enlarged and b is a case where the animation display is to be reduced, the control unit 11 can cause D-1 to be displayed under the following conditions.
D-1: in the case of a, the projection size is set to 2m × 2m
In the case of b, the projection size is set to 1m × 1m
Fig. 10 and 11 are diagrams showing states in which the animation display described in figs. 8 and 9 is performed. Fig. 10 shows a display example in a case where the user is to be guided to gate 4 because gate 1 is crowded and gate 4 is vacant. Fig. 11 shows a display example in a case where the user is guided to both gate 1 and gate 4, in contrast to fig. 10.
Further, the setting of the display parameter that selects guidance to gate 4 as shown in fig. 10, or to both gate 1 and gate 4 as shown in fig. 11, is changed by the airport administrator. Then, the control unit 11 can perform appropriate animation display according to the situation by changing the display processing of each partial image in a fluid manner, as shown in fig. 9, according to the set content based on the display parameter, the environment parameter, and the user parameter.
Example 2 example of guiding a user to a desired place at an event venue at night
Fig. 12 is a diagram showing a state in which animation display is performed at an event venue at night. In fig. 12, the guidance destinations of the 4 event venues A to D are displayed as animations. Here, a case is considered in which event D, corresponding to the rightmost display in fig. 12, is currently being held and the user is to be guided to the venue of event D. In this case, the control unit 11 can stop the animation displays corresponding to events A to C, display only the animation display corresponding to event D while moving it slowly, and display the name of event D together with it.
Here, the display parameter specifying which event the user is to be guided to is set by the manager of the event venue. Therefore, the control unit 11 acquires the images for the display shown in fig. 12 from the image database 13 in accordance with the set content of the display parameter.
The control unit 11 can perform appropriate animation display according to the situation by changing the display processing of each image in a flexible manner based on the parameters set by the environment parameters and the user parameters.
Note that a nighttime event is described here, but the present invention is not limited thereto. For example, the present invention can also be used for indoor events in the daytime, or for outdoor events in the daytime when projection is performed with the color temperature of the projected image raised so that the image is easily visible outdoors even in daylight.
[ other specific examples ]
For example, there is a case where, in a station, it is desired to display the route to each platform as an animation. In such a case, the display location can be switched according to the degree of congestion, so that the animation is displayed on the ceiling when crowded and on the floor when not crowded. The control unit 11 can change the display location according to the setting of the display parameter by the station administrator, or according to a value indicating the degree of congestion that can be acquired as an environmental parameter.
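The station example reduces to a two-way switch on the congestion state. A trivial sketch (the function name and the boolean input are assumptions; in practice the congestion degree would come from the environment parameter):

```python
# Hypothetical sketch of the station example: switch the display location
# according to the degree of congestion.
def display_location(congested):
    """Ceiling when crowded (visible over heads), floor otherwise."""
    return "ceiling" if congested else "floor"
```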
In addition, when the animation display according to embodiment 1 is applied to wayfinding displays in a theme park, it is desirable not to disturb the atmosphere of the theme park. For example, it is conceivable to change the projection image data in a fluid manner so as to avoid animation display with excessive speed changes or with zooming.
The functions of the animation display device according to embodiment 1 are realized by a processing circuit. The processing circuit for realizing each function may be dedicated hardware or a processor that executes a program stored in a memory. Fig. 13 is a configuration diagram showing a case where each function of the animation display device according to embodiment 1 of the present invention is realized by a processing circuit 1000 which is dedicated hardware. Fig. 14 is a configuration diagram showing a case where each function of the animation display device according to embodiment 1 of the present invention is realized by a processing circuit 2000 including a processor 2001 and a memory 2002.
When the processing Circuit is dedicated hardware, a single Circuit, a composite Circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a Circuit in which a combination of these circuits is used corresponds to the processing Circuit 1000. The functions of the respective units in the control unit 11 and the projection unit 12 may be realized by the processing circuit 1000 separately, or the functions of the respective units may be realized by the processing circuit 1000 in a lump.
On the other hand, in the case where the processing circuit is the processor 2001, the functions of each part of the control unit 11 and the projection unit 12 are realized by software, firmware, or a combination of software and firmware. Software and firmware are described as programs and stored in the memory 2002. The processor 2001 realizes the functions of each section by reading out and executing the programs stored in the memory 2002. That is, the animation display device includes the memory 2002, which stores a program that, when executed, causes the processing circuit 2000 to execute, as an example, the steps shown in figs. 3, 5, and 7.
These programs can also be said to be programs that cause a computer to execute the processes or methods of the respective sections described above. Here, a nonvolatile or volatile semiconductor Memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash Memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or the like corresponds to the Memory 2002. Further, a magnetic disk, a flexible disk, an optical disk, a high-density magnetic disk, a mini disk, a DVD, and the like also correspond to the memory 2002.
The functions of the above-described units may be partly implemented by dedicated hardware and partly implemented by software or firmware.
In this way, the processing circuit can realize the functions of the above-described respective sections by hardware, software, firmware, or a combination thereof.
As described above, according to embodiment 1, there is provided a configuration for performing animation display by changing the display processing of an animation image in a fluid manner based on a display parameter, an environment parameter, and a user parameter. That is, the animation display device according to embodiment 1 can perform animation display by changing the display processing of the animation image in a fluid manner using the fluidity parameter as a quantitative index indicating the degree of change.
In addition, the guidance destination can be easily switched according to the display parameter set and changed by the administrator of the display location. As a result, it is possible to perform animation display in which the moving direction can be easily recognized for the user. Further, the facility manager can guide the user to a desired place according to the situation.

Claims (9)

1. A moving picture display device is provided with:
an image database storing image data constituting animation display; and
a control unit for performing the animation display,
the control unit receives a fluidity parameter set to an environmental parameter that can be collected at a location where the animation display is performed and a user parameter that can be collected in correspondence with a user who visually recognizes the animation display,
the control section receives a non-fluidity parameter set in advance as a display parameter,
the control unit performs the animation display by flexibly changing a display process of the image data acquired from the image database based on the fluidity parameter and the non-fluidity parameter.
2. The animation display device according to claim 1,
in the image database, the image data constituting the animation display is stored as a series of image data associated with the display parameters,
the control unit acquires the display mode and the first parameter corresponding to the input of the display parameter from a fixed instruction database in which fixed instruction data in which the display parameter, the display mode, and the first parameter indicating the priority of guidance displayed in the animation are associated with each other is stored in advance,
the control unit acquires a series of image data corresponding to the display mode from the image database as projection image data,
the control unit acquires, as a second parameter, a degree of change in accordance with an input of the environmental parameter from a surrounding environment database in which surrounding environment data in which the environmental parameter and the degree of change at the time of the animation display are associated with each other are stored in advance,
the control unit acquires, as a third parameter, a degree of change according to an input of the user parameter from a user information database in which user information data in which the user parameter and a degree of change at the time of the animation display are associated with each other is stored in advance,
the control unit performs the animation display by changing the projection image data in a fluid manner according to the first parameter, the second parameter, and the third parameter.
3. The animation display device according to claim 1,
in the image database, the image data constituting the animation display is stored as image data of each portion associated with at least any one of the display parameter, the environment parameter, and the user parameter,
the control unit acquires the first display mode and the first parameter according to an input of the display parameter from a fixed instruction database in which fixed instruction data in which the display parameter, the first display mode, and the first parameter indicating a priority of guidance displayed in the animation are stored in advance in association with each other,
the control unit acquires a second display mode corresponding to an input of the environmental parameter from a surrounding environment database in which the environmental parameter, a second display mode, and surrounding environment data associated with a degree of change when the animation is displayed are stored in advance, and acquires a degree of change corresponding to the input of the environmental parameter as the second parameter,
the control unit acquires a third display mode corresponding to an input of the user parameter from a user information database in which user information data in which the user parameter, the third display mode, and a degree of change at the time of the animation display are associated with each other is stored in advance, and acquires the degree of change corresponding to the input of the user parameter as the third parameter,
the control unit acquires, as image data for projection, image data for each portion corresponding to each of the first display mode, the second display mode, and the third display mode from the image database,
the control unit performs the animation display by changing, in a fluid manner, an image of each of the respective portions included in the projection image data in accordance with the first parameter, the second parameter, and the third parameter.
4. The animation display device according to claim 1,
in the image database, the image data constituting the animation display is stored as a combination of a base image that is displayed fixedly and a plurality of guide images that are switched in sequence for the animation display,
the control unit acquires the first parameter corresponding to the input of the display parameter from a fixed instruction database in which fixed instruction data associating the display parameter with the first parameter, which indicates the base image to be selected, is stored in advance,
the control unit acquires, as a second parameter, a degree of change corresponding to the input of the environmental parameter and the first parameter from a surrounding environment database in which surrounding environment data associating the environmental parameter, the first parameter, and a degree of change at the time of the animation display with one another is stored in advance,
the control unit acquires, as a third parameter, a degree of change corresponding to the input of the user parameter from a user information database in which user information data associating the user parameter with a degree of change at the time of the animation display is stored in advance,
the control unit generates projection image data by acquiring a base image corresponding to the first parameter from the image database, acquiring a guide image corresponding to the second parameter from the image database, and combining the acquired base image and guide image,
the control unit performs the animation display by changing, in a fluid manner, the guide image included in the projection image data in accordance with the third parameter.
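The base-plus-guide composition in the claim above can be sketched minimally: the base image stays fixed while the guide images are cycled, with the cycling rate scaled by the third parameter (the degree of change drawn from the user information database). Function and variable names here are assumptions for the example, not from the patent.

```python
# A minimal sketch of claim 4's composition step: a fixed base image is
# combined with guide images that are switched in sequence, at a rate
# scaled by the third parameter.

def compose_projection(base_image, guide_images):
    """Projection image data = fixed base plus the ordered guide sequence."""
    return {"base": base_image, "guides": list(guide_images)}

def frame_at(projection, tick, third_param):
    """Return the (base, guide) pair shown at a given animation tick.

    third_param scales how quickly the guide images are cycled, so the
    same guide sequence plays faster or slower per user.
    """
    guides = projection["guides"]
    idx = int(tick * third_param) % len(guides)
    return projection["base"], guides[idx]
```

For example, with three guide images, a third parameter of 2.0 advances two guide frames per tick instead of one, doubling the apparent guidance speed without changing the stored image data.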
5. The animation display device according to any one of claims 2 to 4,
the environmental parameter, which is a component of the surrounding environment data, includes at least one of a degree of congestion at a predetermined place in the environment where the system is used, the weather at the place where the animation display is performed, the illuminance, the material of the projection destination, and the color temperature of the projection destination.
6. The animation display device according to any one of claims 2 to 5,
the user parameter, which is a component of the user information data, includes at least one of data relating to a movement speed of the user and a physical feature of the user who views the animation display.
7. The animation display device according to claim 2 or 3,
the display mode, which is a component of the fixed instruction data, includes at least one of data relating to the projection size and the standard projection speed at the time of the animation display.
8. The animation display device according to any one of claims 2 to 7,
the control unit acquires movement result information of the user as a result of executing the animation display, and
the control unit performs the animation display by further changing the projection image data in a fluid manner in accordance with the movement result information.
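The feedback step in claim 8 can be sketched as a simple corrective update: the observed movement result nudges the animation speed. The function, its parameters, and the gain value are all assumptions for illustration; the patent does not specify the adjustment rule.

```python
# Hedged sketch of the claim-8 feedback loop: the user's measured
# movement result further adjusts the fluid display processing.

def adjust_speed(current_speed, measured_user_speed, target_user_speed, gain=0.5):
    """Nudge the animation speed toward what the observed movement suggests.

    If the user moves faster than targeted, the animation speeds up
    proportionally; if slower, it slows down.
    """
    return current_speed + gain * (measured_user_speed - target_user_speed)
```

A user walking faster than the target pace would thus see the guidance animation accelerate on the next display cycle, and vice versa.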
9. An animation display method includes the steps of:
receiving, as a fluidity parameter, an environmental parameter that can be collected at a place where animation display is performed and a user parameter that can be collected in correspondence with a user who visually recognizes the animation display;
receiving a non-fluidity parameter preset as a display parameter;
acquiring image data for performing the animation display from an image database storing image data constituting the animation display based on the fluidity parameter and the non-fluidity parameter; and
performing the animation display by changing the display processing of the acquired image data in a fluid manner based on the fluidity parameter and the non-fluidity parameter.
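The four method steps above can be sketched end to end: receive the fluidity and non-fluidity parameters, fetch the image data selected by the non-fluidity parameter, then derive the fluid display processing from the fluidity parameters. All database contents, names, and the speed-factor rule are assumptions for the example.

```python
# Illustrative end-to-end sketch of the four steps of the method claim.

IMAGE_DB = {"route_guide": ["frame_a", "frame_b"]}  # assumed image database
SPEED_FACTORS = {"rain": 0.8, "fast_walker": 1.5}   # assumed fluid adjustments

def run_animation(env_param, user_param, display_param):
    # Steps 1-2: receive the fluidity (environment/user) parameters and
    # the preset non-fluidity (display) parameter.
    # Step 3: acquire the image data selected via the non-fluidity parameter.
    frames = IMAGE_DB[display_param]
    # Step 4: change the display processing in a fluid manner, here as a
    # playback speed derived from the environmental and user parameters.
    speed = SPEED_FACTORS[env_param] * SPEED_FACTORS[user_param]
    return {"frames": frames, "speed": speed}
```

The point of the split is that the image data itself is chosen statically, while only the display processing (speed, here) varies with the environment and the user.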
CN201880100035.9A 2018-12-13 2018-12-13 Animation display device and animation display method Pending CN113170078A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/045873 WO2020121479A1 (en) 2018-12-13 2018-12-13 Animation display device and animation display method

Publications (1)

Publication Number Publication Date
CN113170078A (en) 2021-07-23

Family

ID=71075730

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880100035.9A Pending CN113170078A (en) 2018-12-13 2018-12-13 Animation display device and animation display method

Country Status (3)

Country Link
JP (1) JP6833139B2 (en)
CN (1) CN113170078A (en)
WO (1) WO2020121479A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006209550A (en) * 2005-01-28 2006-08-10 Brother Ind Ltd Information providing device, information providing system, and vending machine
CN101604491A (en) * 2009-07-06 2009-12-16 北京派瑞根科技开发有限公司 Electronic painting that changes with environment and expression
JP2012049850A (en) * 2010-08-27 2012-03-08 Casio Comput Co Ltd Image display unit, image data conversion method, image distribution system, and program
JP2017097749A (en) * 2015-11-27 2017-06-01 株式会社トーメンエレクトロニクス Information providing device and information providing system
CN107274815A (en) * 2017-05-31 2017-10-20 泉州创先力智能科技有限公司 Control method for a window display system
CN107911918A (en) * 2017-10-30 2018-04-13 深圳磊迈照明科技有限公司 Video playing method based on environmental-parameter interaction of light animation
WO2018138842A1 (en) * 2017-01-26 2018-08-02 三菱電機株式会社 Irradiation control device and irradiation method
CN108632594A (en) * 2018-07-17 2018-10-09 王锐 Intelligent commodity information display system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6023280B1 (en) * 2015-07-09 2016-11-09 株式会社リクルートホールディングス Congestion situation estimation system and congestion situation estimation method
WO2018084191A1 (en) * 2016-11-07 2018-05-11 株式会社日立国際電気 Congestion state analysis system

Also Published As

Publication number Publication date
WO2020121479A1 (en) 2020-06-18
JP6833139B2 (en) 2021-02-24
JPWO2020121479A1 (en) 2021-03-11

Similar Documents

Publication Publication Date Title
JP7480823B2 (en) Information processing device, information processing method, and program
US9633479B2 (en) Time constrained augmented reality
EP2323103A1 (en) Detection information registration device, object detection device, electronic device, method for controlling detection information registration device, method for controlling object detection device, program for controlling detection information registration device, and program for controlling object detection device
US20160255282A1 (en) Interactive surveillance overlay
KR102355135B1 (en) Information processing device, information processing method, and program
CN112042184A (en) Image processing device, image processing system and image processing method
CN109961458B (en) Target object tracking method and device and computer readable storage medium
US9886793B1 (en) Dynamic video visualization
KR101645959B1 (en) The Apparatus and Method for Tracking Objects Based on Multiple Overhead Cameras and a Site Map
JP2011109428A (en) Information processing apparatus, information processing method, and program
JP2016224919A (en) Data browsing device, data browsing method, and program
CN111523390B (en) Image recognition method and augmented reality AR icon recognition system
JP2022520512A (en) Data push methods, devices, electronic devices, computer storage media, and computer programs
JP2017090965A (en) Crowd classification device, method thereof and program thereof
KR20140058192A (en) Control image relocation method and apparatus according to the direction of movement of the object of interest
EP3062506B1 (en) Image switching method and apparatus
KR101825600B1 (en) Apparatus and method of guiding designated seat based on augmented reality technique
EP3151243B1 (en) Accessing a video segment
JP4110323B2 (en) Information output method and apparatus, program, and computer-readable storage medium storing information output program
CN110244923B (en) Image display method and device
KR102238790B1 (en) Method for providing content combined with viewing route of exhibit
CN113170078A (en) Animation display device and animation display method
US20080309672A1 (en) System and method for digital video scan using 3-d geometry
US10635925B2 (en) Method and system for display the data from the video camera
US20210014458A1 (en) Entity analysis and tracking in a surveillance system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210723