US20080198178A1 - Providing area zoom functionality for a camera


Info

Publication number
US20080198178A1
US20080198178A1 (application US12/029,758)
Authority
US
United States
Prior art keywords
perimeter
interest
area
image
original image
Prior art date
Legal status
Abandoned
Application number
US12/029,758
Other languages
English (en)
Inventor
Fred Julin
Martin Nilsson
Current Assignee
Axis AB
Original Assignee
Axis AB
Priority date
Filing date
Publication date
Application filed by Axis AB filed Critical Axis AB
Priority to US12/029,758
Assigned to Axis AB. Assignors: Fred Julin; Martin Nilsson
Publication of US20080198178A1
Related: US15/224,798, granted as US9774788B2


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Definitions

  • the present invention relates to a method, in a computer system, for adjusting a camera, a computer program product, and a storage medium.
  • the present invention is intended for use in connection with digital cameras, such as pan/tilt/zoom digital cameras that are often used in various types of surveillance applications or video conferencing systems.
  • One example of such a camera is described in U.S. Pat. No. 5,528,289, which describes a method for automatically adjusting a videoconferencing system camera to center an object.
  • a camera provides a digital image to a user.
  • the image is displayed on a monitor display, such as a computer screen, using some kind of application software.
  • the software allows the user to draw a rectangle around an object or area of interest in the displayed image, using a pointer.
  • the camera is automatically positioned so as to center the object in the monitor display and adjust the zoom and focus so that the designated area in the rectangle fills the display.
  • the rectangle is drawn by the user by placing the pointer at a desired pointer starting point (PSP), corresponding to a corner of the rectangle, dragging the pointer diagonally across the area of interest, and releasing the mouse button at a pointer ending point (PEP), corresponding to the corner of the rectangle diagonally across from the PSP.
  • a controller calculates a center point of the rectangle, based on the PEP and the PSP.
  • the controller calculates a difference between the calculated center point and the midpoint of the monitor display, to determine the pan and tilt of the camera necessary to center the desired picture on the monitor display.
  • the controller performs a set of calculations to determine how much to zoom to the new field of view, and instructs the camera to focus, either by a focus process or by a memory process, which ends the procedure.
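The prior-art centering calculation described above can be sketched as follows. The function name and coordinate conventions are illustrative, not taken from the cited patent: it simply averages the two diagonally opposite corners (PSP and PEP) to get the rectangle's center, then takes its offset from the display midpoint, which is what drives the pan and tilt adjustment.

```python
def prior_art_pan_tilt_offset(psp, pep, display_w, display_h):
    """Center of the user-drawn rectangle from its two opposite
    corners (PSP, PEP), returned as an offset from the display
    midpoint; the offset drives the pan/tilt adjustment."""
    cx = (psp[0] + pep[0]) / 2
    cy = (psp[1] + pep[1]) / 2
    return cx - display_w / 2, cy - display_h / 2

# Rectangle dragged from (100, 80) to (300, 240) on a 640x480 display:
print(prior_art_pan_tilt_offset((100, 80), (300, 240), 640, 480))
# → (-120.0, -80.0), i.e. pan left and tilt up to center the rectangle
```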
  • a problem associated with the above approach arises when a user attempts to zoom in on a particular object of interest. Since the rectangle is drawn with the PSP and the PEP in two opposite corners of the rectangle, the likelihood that the user will be able to optimally include the object of interest within the drawn frame is rather low. Often too little or too much of the object of interest is displayed after the zoom, causing the user to repeat the drawing process several times.
  • U.S. Pat. No. 6,052,110 describes a method for zooming in or zooming out objects that are displayed on a display screen.
  • the user selects a “zoom operation” from a toolbar menu. This causes a cursor to appear on the screen.
  • the user positions the cursor to set a reference point, for example by clicking a mouse button.
  • This causes a reference point indicator to be displayed on the screen.
  • the reference point indicator in the illustrated embodiment is located on the perimeter of a circle. After the reference point has been set, the user can move the cursor away from the reference point.
  • When the cursor is moved inside the perimeter of the circle, a zoom out operation occurs, and when the cursor is moved outside the perimeter of the circle, a zoom in operation occurs.
  • the distance between the cursor position and the reference point determines the speed of the zoom operations. No region of interest is specified, and there is no feedback to a camera in this method.
  • the invention relates to a method, in a computer system, for adjusting a camera.
  • the method includes the steps of receiving an original image from the camera and displaying it to a user, receiving a first user input defining a center point of an area of interest, receiving a second user input defining a perimeter of the area of interest, the area of interest having the same aspect ratio as the original image, and adjusting the camera in accordance with the defined area of interest.
  • Defining a center point makes it possible for the user to specify exactly what portion of the area of interest will be in the center of the zoomed image, and thus to optimally include an object of interest within the zoomed image. Furthermore, since the area of interest is set to the same aspect ratio as the original image, the drawn perimeter lets the user know exactly what portion of the original image will be visible in the zoomed image. This is in contrast to the prior art applications, in which the scaling can occur along different dimensions of the rectangle and a portion of the defined region of interest may not be shown in the zoomed image (or alternatively, more than the defined area of interest may be shown in the zoomed image).
  • the invention relates to a computer program for adjusting a camera.
  • the computer program includes instructions corresponding to the steps of the method described above.
  • the invention relates to a digital storage medium comprising such a computer program.
  • the computer program and the storage medium involve advantages corresponding to those of the method and may be varied similarly.
  • FIG. 1 shows a schematic view of a camera system in accordance with one preferred embodiment of the invention.
  • FIG. 2 shows a flowchart illustrating a process for adjusting a digital camera in accordance with one preferred embodiment of the invention.
  • FIG. 3 shows a first original image from a digital camera in accordance with one preferred embodiment of the invention.
  • FIG. 4 shows a first zoomed image of a region of interest in the first original image in accordance with one preferred embodiment of the invention.
  • FIG. 5 shows a second original image from a digital camera in accordance with one preferred embodiment of the invention.
  • FIG. 6 shows a second zoomed image of a region of interest in the second original image in accordance with one preferred embodiment of the invention.
  • FIG. 7 shows a schematic view of how the dimensions of the perimeter of an area of interest are determined in accordance with one preferred embodiment of the invention.
  • receiving the first user input includes receiving a user selection with a pointing device of a point in the original image.
  • the pointing device can be, for example, a computer mouse or a joystick or other similar device. This enables the user to easily and exactly specify the center point of the area of interest.
  • the point is defined by an (x, y) coordinate pair in an orthogonal coordinate system for the original image. This makes it easy to record the value of the center point and transmit the coordinates to the digital camera for repositioning. If the coordinate system is displayed along with the original image on the display screen, it also allows the user to specify the point using some kind of non-graphical input device, such as using a keyboard or voice control to specify the coordinate values for the point.
  • receiving the second user input includes receiving a second location at a distance from the center point. This enables the user to quickly specify the perimeter of the area of interest, since the center point is known from the first user input, and the aspect ratio of the perimeter is known from the aspect ratio of the original image.
  • the second location is defined by a user using a pointing device in a click-and-drag operation starting at the center point.
  • various types of pointing devices can be used.
  • In a click-and-drag operation, that is, depressing a mouse button at the center point, holding the mouse button depressed while dragging the pointer to the desired location of the perimeter, and releasing the mouse button when the desired perimeter is reached, it may be easier for a user to control the location of the perimeter, giving the user better control for optimally defining the area of interest.
  • the distance between the center point and the second location is equal to half the distance between two diagonally opposite corners on the perimeter. That is, the user does not need to select the second location to be a location on the perimeter of the area of interest. On the contrary, the user can select the second location at any point and in any direction from the center point and the perimeter will be drawn based on the distance between the center point and the second location, with the proper aspect ratio.
  • this can be thought of as selecting the second location on the perimeter of an imaginary circle centered in the center point and drawing the perimeter for the area of interest as a rectangle inscribed in this imaginary circle. Not being limited to defining the second location as a point on the perimeter for the area of interest creates great flexibility and ease of use for the user.
  • drawing the perimeter of the area of interest occurs only when the second location is at a minimum distance from the center point.
  • By defining a minimum threshold, it is possible to avoid unintentional zooming in applications that also have the ability to center an object by clicking on the object. That is, if the user's intention were to center the object by clicking on it, and the user unintentionally moved the mouse a small amount while holding down the mouse button, no zooming would occur.
  • If, on the other hand, the user intended to draw a rectangle specifying an area of interest to zoom in on an object, the user would move the mouse to a second location that exceeds the minimum threshold.
  • the minimum distance is defined by a number of pixels in the original image.
  • the number of pixels can be defined arbitrarily, for example, as a fixed number based on the sensitivity of the input device to user movements, or based on a particular user's motor skills, so that users having various degrees of motor skills can properly use the application interface.
  • the minimum distance is defined by a zooming-capability of the camera. This enables accurate control of the zooming, since the user cannot define a perimeter of an area of interest that is too small and exceeds the digital camera's zooming capabilities. Furthermore, it allows the same user interface to be used for a wide range of digital cameras for which the zooming capability may not always be known. This can be very beneficial, for example, when a single user monitors several locations using a web interface.
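A minimal sketch of how such a threshold check might work. The pixel threshold and the camera's smallest supported zoom factor are assumed example values, and the function name is hypothetical; it treats a too-short drag as a plain "center object" click and rejects drags that would exceed the camera's zooming capability:

```python
from math import hypot

MIN_DRAG_PIXELS = 8           # assumed pixel threshold around the center point
MIN_ZOOM_FACTOR = 0.05        # assumed smallest zoom factor the camera supports

def drag_triggers_zoom(x0, y0, x_end, y_end, img_w, img_h,
                       min_pixels=MIN_DRAG_PIXELS,
                       min_zoom=MIN_ZOOM_FACTOR):
    """Return True if the drag from center point (x0, y0) should draw a
    perimeter and trigger a zoom; False for unintentional movement or
    for areas of interest too small for the camera to zoom in on."""
    r = hypot(x_end - x0, y_end - y0)
    if r < min_pixels:
        return False                          # treat as a center-only click
    factor = 2 * r / hypot(img_w, img_h)      # perimeter diagonal equals 2r
    return factor >= min_zoom

print(drag_triggers_zoom(320, 240, 323, 242, 640, 480))  # → False (below threshold)
print(drag_triggers_zoom(320, 240, 400, 300, 640, 480))  # → True
```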
  • a zoom factor is calculated as the ratio of the diagonal distance between two opposite corners of the area of interest and the diagonal distance between two opposite corners of the original image, and the calculated zoom factor is provided to the camera for use in adjusting the camera. Calculating the zoom factor in this manner provides a single value to send to the digital camera, so that the camera can adjust to the proper zoom level, and requires very little processing power and storage space.
  • a zoom factor smaller than “1” (that is, less than 100%) corresponds to zooming in,
  • a zoom factor equal to “1” corresponds to no zooming, and
  • a zoom factor larger than “1” (that is, more than 100%) corresponds to zooming out.
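The diagonal-ratio calculation described above can be sketched as follows (function name illustrative):

```python
from math import hypot

def zoom_factor(x1, y1, x2, y2, img_w, img_h):
    """Ratio of the area-of-interest diagonal (corners (x1, y1) and
    (x2, y2)) to the original image diagonal; a value < 1 zooms in,
    1 leaves the view unchanged, and > 1 zooms out."""
    return hypot(x2 - x1, y2 - y1) / hypot(img_w, img_h)

# A perimeter spanning (240, 180)-(400, 300) on a 640x480 image:
print(zoom_factor(240, 180, 400, 300, 640, 480))  # → 0.25, i.e. zoom in 4x
```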
  • At least a portion of the perimeter of the area of interest can extend outside the perimeter of the original image. This is typically the case when an object of interest that the user wishes to zoom in on is located close to or at the edge of the original image.
  • By extending the area of interest outside the image it is possible to see details in the zoomed image that were not visible in the original image. This is not possible in prior art implementations, since the rectangular frame is drawn starting and ending in two corners of the rectangle, which means that the entire region of interest must be visible in the original image. As the skilled reader realizes, this applies both to zoomed-in images and to zoomed-out images.
  • the area of interest includes the entire original image and the zoomed image is a zoomed-out image.
  • This enables the user to perform similar operations for zooming out and zooming in, which creates an intuitive and easy to understand user interface.
  • no separate on-screen controls are needed for zooming in and zooming out, which can save valuable real estate space on the display screen that can instead be used to show images. This is particularly important for display screens on mobile devices, which are generally smaller than the display screens on stationary devices, or in situations when multiple images are simultaneously shown on a single display.
  • a system 100 in accordance with one preferred embodiment of the invention includes a digital camera 120 , which is illustrated in a basic modular form.
  • the digital camera 120 is arranged to produce one or several digital image(s) of an object 110 , which can be any physical object that is present in a scene optically covered by the digital camera 120 .
  • the digital camera 120 has an optical input in the form of a lens or objective 130 .
  • the objective 130 is optically coupled to an image capturing unit 140, which is provided with appropriate means for producing a digital image representative of the object 110.
  • the image capturing unit 140 includes a Charge Coupled Device (CCD) element, which is well known to a person of ordinary skill in the art.
  • the image capturing unit 140 is coupled to a controller 150 and a memory 160 .
  • the controller 150 controls the image capturing unit 140 .
  • the controller 150 is also operatively connected to the digital memory 160 for storing images captured by the image capturing unit 140 .
  • the memory 160 can be implemented by any commercially available memory, such as an SRAM memory.
  • the digital camera 120 can be connected to a wired or wireless network 170 , such as an Ethernet or Token Ring network, which in turn can be part of the Internet.
  • the controller 150 of the digital camera 120 is provided with appropriate software for allowing the digital camera 120 to act as a network camera available on the network 170 , that is, including a video server that can produce digital images.
  • the pan and/or tilt angle and zoom of the digital camera 120 can be set and changed by a user of the digital camera 120 by accessing the controller 150 through the network 170 , using a computer 180 or other type of device capable of communicating with the controller 150 over the network 170 .
  • the pan and/or tilt angle(s) and zoom can be controlled from a computer directly connected to the digital camera 120 .
  • FIG. 2 shows a process 200 for adjusting the digital camera 120 in accordance with a preferred embodiment of the invention.
  • the process 200 starts by receiving an original image from the camera 120 and displaying it to a user on the display screen of the computer 180, step 210.
  • An example of an original image 300 can be seen in FIG. 3 .
  • the original image 300 has an aspect ratio that is defined by the ratio of the longer side of the original image 300 and the shorter side of the original image 300 .
  • the resolution of the original image 300 typically depends on the settings in the software that displays the original image 300 and/or the settings in the digital camera 120 .
  • the mechanisms for selecting image resolutions are well known to those of ordinary skill in the art, and will not be discussed here.
  • Typical image resolutions are 640 by 480 pixels for VGA systems, and 704 by 576 pixels or 704 by 480 pixels for PAL systems.
  • a first input is received from a user, which defines a center point of an area of interest, step 220.
  • the user input is received in the form of a mouse click at a particular point of the image that the user would like to center in the zoomed image that is to be generated.
  • the input can be provided using any other type of graphical input device, such as a joystick.
  • the input can also be provided by a conventional keyboard or through voice control. These input mechanisms are particularly useful where the point can be easily identified by a coordinate pair.
  • the center point 310 identified by the user corresponds to the license plate of the white van on the right hand side of the original image 300 .
  • a second user input defining a perimeter of the area of interest is received, step 230 .
  • the perimeter 320 defines an area of interest that will be shown in the zoomed image after the digital camera has adjusted its zoom level and pan/tilt positions.
  • the area of interest has the same aspect ratio as the original image 300 . That is, the ratio of the long side and the short side of the perimeter 320 is identical to the ratio of the long side and the short side of the original image 300 .
  • the second user input is received in the form of a click-and-drag operation using a computer mouse, similar to what is done in conventional applications, and what is well-known to those of ordinary skill in the art.
  • the click and drag operation starts at the center point 310 and not in one of the corners of the perimeter 320 for the area of interest. This allows the user to more accurately center an object of interest, such as the license plate in the original image 300 .
  • the perimeter 320 is drawn as the user drags the computer mouse out from the center towards the perimeter.
  • the click-and-drag operation can be thought of as dragging the pointer outwards from the center of an imaginary circle along the circle's radius.
  • the application software draws a perimeter 320 in the shape of a rectangle.
  • the rectangle has the same aspect ratio as the original image 300 and is inscribed in the imaginary circle.
  • the ending point of the click-and-drag operation does not have to be a point on the perimeter 320 . Instead, the distance between the starting point and ending point of the click-and-drag operation is used to compute the dimensions and location of the perimeter 320 .
  • FIG. 7 shows a schematic example of such an embodiment and how the dimensions of the perimeter 320 are computed.
  • a display screen 700 has the horizontal dimension xc and the vertical dimension yc.
  • the user selects a center point 310 for the area of interest with the coordinates (x0, y0), depresses the mouse button, and moves the pointer along a radius r 710 to an endpoint 720 on the perimeter of an imaginary circle 730 .
  • This provides sufficient information for calculating the coordinates of two diagonally opposite corners of the perimeter, located on the perimeter of the imaginary circle 730 , and keeping in mind that the perimeter 320 has the same aspect ratio as the display screen 700 .
  • the upper left corner 740 has the coordinates (x1, y1)
  • the lower right corner 750 has the coordinates (x2, y2). These coordinates are calculated as follows.
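In the published patent the actual formulas appear as figures; the geometry described above determines them, though: the rectangle's diagonal equals the imaginary circle's diameter 2r, and its sides keep the display's aspect ratio xc:yc. A minimal sketch of the calculation, with an illustrative function name:

```python
from math import hypot

def perimeter_corners(x0, y0, r, xc, yc):
    """Corners (x1, y1) and (x2, y2) of the area-of-interest rectangle
    inscribed in the imaginary circle of radius r centered at (x0, y0),
    with the same aspect ratio xc:yc as the display screen."""
    scale = r / hypot(xc, yc)       # half-diagonal of the rectangle is r
    half_w, half_h = xc * scale, yc * scale
    upper_left = (x0 - half_w, y0 - half_h)    # (x1, y1)
    lower_right = (x0 + half_w, y0 + half_h)   # (x2, y2)
    return upper_left, lower_right

# Example: 640x480 display, center point (320, 240), drag radius 100 px:
print(perimeter_corners(320, 240, 100, 640, 480))
# → ((240.0, 180.0), (400.0, 300.0))
```

Note that each corner lies exactly on the imaginary circle: in the example, the corner-to-center distance is hypot(80, 60) = 100 = r.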
  • the perimeter 320 is not drawn until the user releases the mouse button after completing the dragging.
  • the user does not have to click-and-drag, but can simply click a first time at the center point 310 and a second time at a location where the user desires to place the perimeter 320 .
  • Some embodiments may contain a separate “center object function,” which causes the camera to center on an object that is selected by the user, without zooming in on the object, similar to the first user input described above.
  • some embodiments do not allow the perimeter 320 to be drawn until the cursor is at a minimum distance from the selected center point 310 .
  • some embodiments draw the perimeter 320 regardless of the distance from the center point 310 , but draw the perimeter 320 in varying colors. For example, a red perimeter 320 may be drawn while the cursor is too close to the center point 310 , and the color of the perimeter 320 may switch to green when the cursor is further away from the center point 310 than the specified minimum distance.
  • this minimum distance between the center point 310 and the perimeter 320 is defined in terms of pixels, for example, an area of 8×8 pixels surrounding the center point 310, in the original image 300.
  • the minimum distance is defined based on the zooming capabilities of the camera. For example, when the user defines an area of interest that is so small that the camera would not be able to zoom in to the level desired by the user, no perimeter 320 would be drawn. Instead, the camera may do nothing or perform a center object operation. This approach is particularly useful when multiple cameras are controlled through the same interface, for example, in the case of a web interface for controlling several cameras.
  • the amount of zooming needed is computed by calculating the ratio of the diagonal distance between two opposite corners on the perimeter 320 , and the diagonal distance between two opposite corners on the original image 300 .
  • the result gives a value greater than 0, where a value smaller than 1 corresponds to zooming in, a value larger than 1 corresponds to zooming out, and a value of 1 corresponds to no zooming.
  • This value is often referred to as the zoom factor.
  • this value can also be expressed as a percentage value, where a value smaller than 100% corresponds to zooming in, a value larger than 100% corresponds to zooming out, and a value of 100% corresponds to no zooming. Having a single value for a zoom factor makes it easy to send the zoom data to the camera, and also makes it easy to determine whether the desired zooming in is within the camera's capabilities, as discussed in the previous paragraphs.
  • the data is sent to the controller 150 in the camera 120 .
  • the controller 150 adjusts the camera view to tilt/pan/zoom, either by physically adjusting the camera 120 or through digital image processing, to the desired image view in accordance with methods that are well known to those of ordinary skill in the art, step 240.
  • the tilt/pan operations and the zoom operation can occur simultaneously as part of a single operation or occur as separate steps.
  • the data that is sent from the computer 180 to the controller 150 includes an (x, y) coordinate pair for the center point 310 , and the zoom factor, determined as described above.
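As a sketch, the data sent from the computer to the controller could be bundled as below; the class and field names are hypothetical, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class CameraAdjustment:
    """Illustrative payload sent from the computer to the camera
    controller: the (x, y) center point and the zoom factor."""
    x: int
    y: int
    zoom_factor: float

cmd = CameraAdjustment(x=320, y=240, zoom_factor=0.25)
print(cmd)  # → CameraAdjustment(x=320, y=240, zoom_factor=0.25)
```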
  • FIG. 4 shows an example of a zoomed image 400 corresponding to the area of interest within the perimeter 320 in the original image 300 .
  • the zoomed image 400 is shown at the same size and dimensions as the original image 300 .
  • the license plate, which was chosen as the center point 310, is centered and clearly visible in the zoomed image 400.
  • zooming out can be accomplished by defining a perimeter 320 around an area of interest that is larger than the area of the original image 300.
  • the area of interest can include all of the original image, or a portion of the original image, for example, if the center point is placed close to an edge of the original image 300 and the area of interest is selected to be larger than the area of the original image 300.
  • the zoom factor will have a value that is greater than 100%. Similar to the case of zooming in, it is possible to define a threshold value for when to draw the perimeter 320.
  • this threshold value can be connected to the zooming capability of the digital camera 120 .
  • no perimeter 320 may be drawn, or the perimeter 320 may be drawn with a different color, analogous to the above discussion for the zooming-in case.
  • This method of zooming out is especially advantageous since it is intuitive to use for a user who is already familiar with the zooming-in methods described above.
  • it is not necessary to have any on-screen controls for zooming in and zooming out, which may be especially useful when display screen real estate is scarce, for example, when there are several pictures on a display screen or when a picture fills the entire display screen, as often may be the case on smaller, portable displays.
  • FIG. 5 shows a case in which the perimeter extends outside the original image: a user would like to zoom in on a white van 510 located at the left edge of the original image 500.
  • the white van 510 is selected as the center point, and the user performs a click-and-drag operation to define the perimeter 520 of the area of interest, which will be shown in the zoomed image.
  • The part of the perimeter 520 that lies within the original image 500 is drawn; the rest of the perimeter 520 is located outside the original image 500 and is thus not drawn.
  • FIG. 6 shows the resulting zoomed image 600 of the area of interest defined in the original image 500 .
  • the white van is located approximately at the center of the zoomed image 600, and another car and a flagpole, which were not visible in the original image 500, can now clearly be seen.
  • In prior art applications, by contrast, the user would first have to center the original image so that the entire area to be zoomed in on would be visible, and then draw a rectangle defining the area of interest by starting and ending in two corners of the rectangle.
  • the embodiment shown in FIGS. 5 and 6 greatly simplifies the user manipulations and improves the accuracy compared to the prior art applications.
  • the invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
  • Apparatus of the invention can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the invention can be performed by a programmable processor executing a program of instructions to perform functions of the invention by operating on input data and generating output.
  • the invention can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
  • Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
  • Suitable processors include, by way of example, both general and special purpose microprocessors.
  • a processor will receive instructions and data from a read-only memory and/or a random access memory.
  • a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
  • Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
  • The invention can be implemented on a computer system having a display device, such as a monitor or LCD screen, for displaying information to the user.
  • The user can provide input to the computer system through various input devices, such as a keyboard, a pointing device (for example, a mouse or a trackball), a microphone, a touch-sensitive display, a transducer card reader, a magnetic or paper tape reader, a tablet, a stylus, a voice or handwriting recognizer, or any other well-known input device, including, of course, other computers.
  • The computer system can be programmed to provide a graphical user interface through which computer programs interact with users.
  • The computer system can also be programmed to provide a “pre-recorded tour” representing a command sequence that was recorded by a user at an earlier point in time using any of the above-mentioned input devices. This recorded tour may include the zooming functionality discussed above.
  • The processor can optionally be coupled to a computer or telecommunications network, for example the Internet or an intranet, using a network connection through which the processor can receive information from the network or output information to the network in the course of performing the above-described method steps.
  • The network can be any combination of wired and wireless networks, as is familiar to those of ordinary skill in the computer hardware and software arts.
  • The information received over the network is often represented as a sequence of instructions to be executed by the processor and may be received from and output to the network, for example, in the form of a computer data signal embodied in a carrier wave.
  • The above-described devices and materials will be familiar to those of skill in the computer hardware and software arts.
  • The present invention employs various computer-implemented operations involving data stored in computer systems. These operations include, but are not limited to, those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated.
  • The operations described herein that form part of the invention are useful machine operations. The manipulations performed are often referred to in terms such as producing, identifying, running, determining, comparing, executing, downloading, or detecting. It is sometimes convenient, principally for reasons of common usage, to refer to these electrical or magnetic signals as bits, values, elements, variables, characters, data, or the like. It should be remembered, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
  • The present invention also relates to a device, system, or apparatus for performing the aforementioned operations.
  • The system may be specially constructed for the required purposes, or it may be a general-purpose computer selectively activated or configured by a computer program stored in the computer.
  • The processes presented above are not inherently related to any particular computer or other computing apparatus.
  • Various general-purpose computers may be used with programs written in accordance with the teachings herein; alternatively, it may be more convenient to construct a more specialized computer system to perform the required operations.
  • Although pan/tilt/zoom operations have been described above as physical operations of the digital camera 120, these operations can also be implemented as digital image processing operations in a digital camera, such as a wide-angle digital camera. Accordingly, other embodiments are within the scope of the following claims.
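The “pre-recorded tour” mentioned above is essentially a command sequence captured from the user's input devices and replayed later. A minimal sketch of that idea in Python; the command names (`pan`, `zoom`) and the `TourRecorder` class are illustrative assumptions, not taken from the patent:

```python
import time


class TourRecorder:
    """Records camera commands with timestamps so they can be
    replayed later as a pre-recorded tour."""

    def __init__(self):
        self._commands = []  # list of (timestamp, command name, parameters)

    def record(self, command, **params):
        # Capture the command and when it was issued.
        self._commands.append((time.monotonic(), command, params))

    def replay(self, camera):
        # Replay each recorded command against a camera object.
        # (Timing between commands is ignored here for simplicity.)
        for _, command, params in self._commands:
            getattr(camera, command)(**params)
```

Any object exposing methods matching the recorded command names can serve as the `camera` target during replay, which keeps the recorder independent of whether the zoom is optical or digital.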
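The last point, pan/tilt/zoom as a digital image processing operation, amounts to cropping the perimeter drawn around the area of interest out of the original image and scaling the crop to the output size. A minimal sketch in plain Python using nearest-neighbour sampling; the `Perimeter` type and function name are hypothetical, as the patent does not prescribe an implementation:

```python
from dataclasses import dataclass


@dataclass
class Perimeter:
    """Axis-aligned perimeter around an area of interest, in pixel
    coordinates of the original image (right/bottom are exclusive)."""
    left: int
    top: int
    right: int
    bottom: int


def digital_area_zoom(image, perimeter, out_width, out_height):
    """Digitally zoom into the area of interest: crop the perimeter
    from the original image (a list of rows of pixels) and scale the
    crop to out_width x out_height with nearest-neighbour sampling."""
    crop = [row[perimeter.left:perimeter.right]
            for row in image[perimeter.top:perimeter.bottom]]
    crop_h, crop_w = len(crop), len(crop[0])
    # Map each output pixel back to its nearest source pixel in the crop.
    return [
        [crop[y * crop_h // out_height][x * crop_w // out_width]
         for x in range(out_width)]
        for y in range(out_height)
    ]
```

In practice a camera or viewer would use a hardware scaler or an interpolating resize rather than nearest-neighbour, but the crop-then-scale structure is the same.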

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
US12/029,758 2007-02-16 2008-02-12 Providing area zoom functionality for a camera Abandoned US20080198178A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/029,758 US20080198178A1 (en) 2007-02-16 2008-02-12 Providing area zoom functionality for a camera
US15/224,798 US9774788B2 (en) 2007-02-16 2016-08-01 Providing area zoom functionality for a camera

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
EP07102569.6A EP1959389B1 (en) 2007-02-16 2007-02-16 Providing area zoom functionality for a camera
EP07102569.6 2007-02-16
US89197407P 2007-02-28 2007-02-28
US12/029,758 US20080198178A1 (en) 2007-02-16 2008-02-12 Providing area zoom functionality for a camera

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/224,798 Continuation US9774788B2 (en) 2007-02-16 2016-08-01 Providing area zoom functionality for a camera

Publications (1)

Publication Number Publication Date
US20080198178A1 true US20080198178A1 (en) 2008-08-21

Family

ID=38421671

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/029,758 Abandoned US20080198178A1 (en) 2007-02-16 2008-02-12 Providing area zoom functionality for a camera
US15/224,798 Active US9774788B2 (en) 2007-02-16 2016-08-01 Providing area zoom functionality for a camera

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/224,798 Active US9774788B2 (en) 2007-02-16 2016-08-01 Providing area zoom functionality for a camera

Country Status (6)

Country Link
US (2) US20080198178A1 (ja)
EP (1) EP1959389B1 (ja)
JP (1) JP4642868B2 (ja)
KR (1) KR100940971B1 (ja)
CN (1) CN101247461B (ja)
TW (1) TWI461058B (ja)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090251539A1 (en) * 2008-04-04 2009-10-08 Canon Kabushiki Kaisha Monitoring device
US20090300554A1 (en) * 2008-06-03 2009-12-03 Nokia Corporation Gesture Recognition for Display Zoom Feature
US20100141767A1 (en) * 2008-12-10 2010-06-10 Honeywell International Inc. Semi-Automatic Relative Calibration Method for Master Slave Camera Control
US20120218457A1 (en) * 2011-02-24 2012-08-30 Hon Hai Precision Industry Co., Ltd. Auto-focusing camera device, storage medium, and method for automatically focusing the camera device
US20120246013A1 (en) * 2008-07-07 2012-09-27 Google Inc. Claiming real estate in panoramic or 3d mapping environments for advertising
US20130329067A1 (en) * 2012-06-12 2013-12-12 Canon Kabushiki Kaisha Capturing control apparatus, capturing control method and program
US20150063705A1 (en) * 2013-08-29 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus, computer-readable medium for content aware multimedia resizing
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
US20170104924A1 (en) * 2013-02-22 2017-04-13 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US20170147174A1 (en) * 2015-11-20 2017-05-25 Samsung Electronics Co., Ltd. Image display device and operating method of the same
US9706162B2 (en) 2010-04-14 2017-07-11 Sisvel Technology S.R.L. Method for displaying a video stream according to a customised format
US11775078B2 (en) 2013-03-15 2023-10-03 Ultrahaptics IP Two Limited Resource-responsive motion capture

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2382772B1 (en) * 2008-12-23 2012-10-03 TP Vision Holding B.V. Image scaling curve generation
KR101111503B1 (ko) * 2010-02-17 2012-02-22 (주)서광시스템 Omnidirectional PTZ camera control apparatus and method
CN101808200B (zh) * 2010-03-16 2011-12-21 浙江大学 Camera metering method based on a region of interest
CN102447887B (zh) * 2011-10-14 2014-12-17 深圳市先河系统技术有限公司 Method and device for changing the image aspect ratio in a security system video signal
CN103595912B (zh) 2013-09-30 2018-02-06 北京智谷睿拓技术服务有限公司 Local zoom imaging method and device
US10289284B2 (en) * 2014-11-25 2019-05-14 International Business Machines Corporation Viewing selected zoomed content
TWI547177B (zh) * 2015-08-11 2016-08-21 晶睿通讯股份有限公司 View angle switching method and camera thereof
CN105578275A (zh) * 2015-12-16 2016-05-11 小米科技有限责任公司 Video display method and device
CN105867858A (zh) * 2016-03-22 2016-08-17 广东欧珀移动通信有限公司 Control method, control device and electronic device
EP3485639A4 (en) * 2016-07-18 2020-03-04 Glide Talk, Ltd. SYSTEM AND METHOD FOR PROVIDING OBJECT-ORIENTED ZOOM IN MULTIMEDIA MESSAGING
JP6669087B2 (ja) * 2017-01-27 2020-03-18 京セラドキュメントソリューションズ株式会社 Display device
KR102585216B1 (ko) 2017-12-14 2023-10-05 삼성전자주식회사 Image recognition method and apparatus
CN108683860A (zh) * 2018-08-28 2018-10-19 深圳市必发达科技有限公司 Mobile phone self-portrait method, computer-readable storage medium, and computer device
KR102565900B1 (ko) * 2019-01-30 2023-08-09 한화비전 주식회사 Imaging apparatus providing an area zoom function and method therefor
KR102074892B1 (ko) * 2019-04-16 2020-02-07 주식회사 디케이앤트 PTZ precision control server and method using the viewing angle of a CCTV camera
KR102074900B1 (ko) 2019-04-16 2020-03-02 주식회사 디케이앤트 Area zoom precision control method and server for a PTZ camera

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5528289A (en) * 1993-10-20 1996-06-18 Videoconferencing Systems, Inc. Method for automatically adjusting a videoconferencing system camera to center an object
US5838840A (en) * 1996-08-29 1998-11-17 Bst/Pro Mark Inspection device using a field mode video camera with interpolation to replace missing pixels
US6052110A (en) * 1998-05-11 2000-04-18 Sony Corporation Dynamic control of zoom operation in computer graphics
US6396507B1 (en) * 1996-09-13 2002-05-28 Nippon Steel Corporation Data storage/access network system for zooming image and method of the storage/access
US20030122853A1 (en) * 2001-12-29 2003-07-03 Kim Jeong Woo Method for tracing enlarged region of moving picture
US6704048B1 (en) * 1998-08-27 2004-03-09 Polycom, Inc. Adaptive electronic zoom control
US6714692B1 (en) * 2000-02-16 2004-03-30 Korea Advanced Institute Of Science And Technology Image scaling method and apparatus using continuous domain filtering and interpolation method
US20050151885A1 (en) * 2003-12-08 2005-07-14 Lg Electronic Inc. Method of scaling partial area of main picture
US6971063B1 (en) * 2000-07-28 2005-11-29 Wireless Valley Communications Inc. System, method, and apparatus for portable design, deployment, test, and optimization of a communication network
US20060026170A1 (en) * 2003-05-20 2006-02-02 Jeremy Kreitler Mapping method and system
US20060126894A1 (en) * 2004-11-09 2006-06-15 Nec Corporation Video phone
US20080104027A1 (en) * 2006-11-01 2008-05-01 Sean Michael Imler System and method for dynamically retrieving data specific to a region of a layer
US7554522B2 (en) * 2004-12-23 2009-06-30 Microsoft Corporation Personalization of user accessibility options
US20090187863A1 (en) * 2005-03-10 2009-07-23 Nintendo Co., Ltd. Storage medium storing input processing program and input processing apparatus
US20100283796A1 (en) * 2004-03-03 2010-11-11 Gary Kramer System for Delivering and Enabling Interactivity with Images

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04196774A (ja) * 1990-11-28 1992-07-16 Hitachi Ltd Video camera apparatus
JP3315554B2 (ja) * 1994-11-28 2002-08-19 キヤノン株式会社 Camera control device
JPH10200793A (ja) * 1997-01-09 1998-07-31 Nikon Corp Finder device and camera equipped with the same
JPH10336494A (ja) 1997-05-29 1998-12-18 Seiko Epson Corp Digital camera with zoom display function
JP3763383B2 (ja) * 1999-08-27 2006-04-05 株式会社デジタル Polygon drawing apparatus, polygon drawing method, and computer-readable recording medium storing a polygon drawing program
KR20020040948A (ko) * 2000-11-25 2002-05-31 윤종용 Monitoring system and control method thereof
JP4093053B2 (ja) * 2002-02-28 2008-05-28 セイコーエプソン株式会社 Image area designation and image correction
KR20050082789A (ko) * 2004-02-20 2005-08-24 주식회사 팬택앤큐리텔 Image enlargement display apparatus and method for a portable communication terminal
JP2005276020A (ja) * 2004-03-26 2005-10-06 Ryoichi Eguchi Method for setting a selection range in an image, and selection ruler
JP2005286926A (ja) * 2004-03-30 2005-10-13 Saxa Inc Imaging apparatus
JP2005328476A (ja) 2004-05-17 2005-11-24 Sony Corp Imaging apparatus and drive control method for a solid-state imaging device
US20080063389A1 (en) * 2006-09-13 2008-03-13 General Instrument Corporation Tracking a Focus Point by a Remote Camera

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Adobe Photoshop Basics Lesson 4a: Selection Tools; February 1, 2002; About.com; 1 page; http://graphicssoft.about.com/od/photoshop/l/bllps504a.htm *
Cropping, Panning and Zooming; January 18, 2004; Vegas; Tutorial; pages 1-5; http://users.wowway.com/~wvg/tutorial-14.htm *
Working with Selections; February 1, 2001; Adobe Photoshop 5.0; Tutorial; pages 1-14; http://people.csail.mit.edu/fredo/Depiction/1_Introduction/Tutorial/Lesson01/Lesson01.pdf *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9224279B2 (en) * 2008-04-04 2015-12-29 Canon Kabushiki Kaisha Tour monitoring device
US20090251539A1 (en) * 2008-04-04 2009-10-08 Canon Kabushiki Kaisha Monitoring device
US20090300554A1 (en) * 2008-06-03 2009-12-03 Nokia Corporation Gesture Recognition for Display Zoom Feature
US9436425B2 (en) * 2008-07-07 2016-09-06 Google Inc. Claiming real estate in panoramic or 3D mapping environments for advertising
US20120246013A1 (en) * 2008-07-07 2012-09-27 Google Inc. Claiming real estate in panoramic or 3d mapping environments for advertising
US9092833B2 (en) * 2008-07-07 2015-07-28 Google Inc. Claiming real estate in panoramic or 3D mapping environments for advertising
US20150286454A1 (en) * 2008-07-07 2015-10-08 Google Inc. Claiming Real Estate in Panoramic or 3D Mapping Environments for Advertising
US20100141767A1 (en) * 2008-12-10 2010-06-10 Honeywell International Inc. Semi-Automatic Relative Calibration Method for Master Slave Camera Control
US8488001B2 (en) * 2008-12-10 2013-07-16 Honeywell International Inc. Semi-automatic relative calibration method for master slave camera control
US9706162B2 (en) 2010-04-14 2017-07-11 Sisvel Technology S.R.L. Method for displaying a video stream according to a customised format
US20120218457A1 (en) * 2011-02-24 2012-08-30 Hon Hai Precision Industry Co., Ltd. Auto-focusing camera device, storage medium, and method for automatically focusing the camera device
US20130329067A1 (en) * 2012-06-12 2013-12-12 Canon Kabushiki Kaisha Capturing control apparatus, capturing control method and program
US9531935B2 (en) * 2012-06-12 2016-12-27 Canon Kabushiki Kaisha Capturing control apparatus, capturing control method and program
US9986153B2 (en) * 2013-02-22 2018-05-29 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US10999494B2 (en) 2013-02-22 2021-05-04 Ultrahaptics IP Two Limited Adjusting motion capture based on the distance between tracked objects
US20170104924A1 (en) * 2013-02-22 2017-04-13 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US9762792B2 (en) * 2013-02-22 2017-09-12 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US20170374279A1 (en) * 2013-02-22 2017-12-28 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US11418706B2 (en) 2013-02-22 2022-08-16 Ultrahaptics IP Two Limited Adjusting motion capture based on the distance between tracked objects
US10348959B2 (en) 2013-02-22 2019-07-09 Leap Motion, Inc. Adjusting motion capture based on the distance between tracked objects
US10638036B2 (en) 2013-02-22 2020-04-28 Ultrahaptics IP Two Limited Adjusting motion capture based on the distance between tracked objects
US11775078B2 (en) 2013-03-15 2023-10-03 Ultrahaptics IP Two Limited Resource-responsive motion capture
US9384412B2 (en) * 2013-08-29 2016-07-05 Samsung Electronics Co., Ltd. Method and apparatus, computer-readable medium for content aware multimedia resizing
US20150063705A1 (en) * 2013-08-29 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus, computer-readable medium for content aware multimedia resizing
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
US9972121B2 (en) * 2014-04-22 2018-05-15 Google Llc Selecting time-distributed panoramic images for display
US11150787B2 (en) * 2015-11-20 2021-10-19 Samsung Electronics Co., Ltd. Image display device and operating method for enlarging an image displayed in a region of a display and displaying the enlarged image variously
US20170147174A1 (en) * 2015-11-20 2017-05-25 Samsung Electronics Co., Ltd. Image display device and operating method of the same

Also Published As

Publication number Publication date
JP4642868B2 (ja) 2011-03-02
EP1959389B1 (en) 2017-11-15
JP2008206153A (ja) 2008-09-04
KR20080076846A (ko) 2008-08-20
TW200847772A (en) 2008-12-01
KR100940971B1 (ko) 2010-02-05
US20160344942A1 (en) 2016-11-24
CN101247461A (zh) 2008-08-20
TWI461058B (zh) 2014-11-11
US9774788B2 (en) 2017-09-26
CN101247461B (zh) 2011-12-21
EP1959389A1 (en) 2008-08-20

Similar Documents

Publication Publication Date Title
US9774788B2 (en) Providing area zoom functionality for a camera
US11481096B2 (en) Gesture mapping for image filter input parameters
AU2020100720B4 (en) User interfaces for capturing and managing visual media
US6992702B1 (en) System for controlling video and motion picture cameras
US10628010B2 (en) Quick review of captured image data
EP2283642B1 (en) Method, apparatus, and computer program product for presenting burst images
AU2022221466B2 (en) User interfaces for capturing and managing visual media
KR20160074658A (ko) 프리뷰 인터페이스 선택 영역의 줌인 방법 및 장치
CN111835972B (zh) 拍摄方法、装置和电子设备
KR100787987B1 (ko) 팬틸트 카메라의 제어장치 및 그 기록매체
JP3744995B2 (ja) 撮像方法とその装置
JP2018098627A (ja) 情報処理装置、情報処理方法、及びプログラム
CN116980759A (zh) 拍摄方法、终端、电子设备及可读存储介质
TW202013161A (zh) 操作互動觸控顯示系統之方法

Legal Events

Date Code Title Description
AS Assignment

Owner name: AXIS AB, SWEDEN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JULIN, FRED;NILSSON, MARTIN;REEL/FRAME:020804/0653

Effective date: 20080207

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION