Scene Summarization via Motion Normalization

IEEE Trans Vis Comput Graph. 2021 Apr;27(4):2495-2501. doi: 10.1109/TVCG.2020.2993195. Epub 2021 Feb 25.

Abstract

When observing the visual world, temporal phenomena are ubiquitous: people walk, cars drive, rivers flow, clouds drift, and shadows elongate. Some of these, like water splashing and cloud motion, occur over time intervals that are either too short or too long for humans to easily observe. High-speed and timelapse videos provide a popular and compelling way to visualize these phenomena, but many real-world scenes exhibit motions occurring at a variety of rates. Once a framerate is chosen, phenomena at other rates are at best invisible, and at worst create distracting artifacts. In this article, we propose to automatically normalize the pixel-space speed of different motions in an input video to produce a seamless output with a spatiotemporally varying framerate. To achieve this, we analyze scenes at different timescales to isolate motions that occur at vastly different rates. Our method optionally allows a user to specify additional constraints according to artistic preferences. The motion-normalized output provides a novel way to compactly visualize the changes occurring in a scene over a broad range of timescales.
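To make the notion of normalizing pixel-space speed concrete, the sketch below shows a deliberately simplified, temporal-only variant of the idea, not the authors' method: it estimates per-frame motion with optical flow and keeps an input frame whenever roughly a fixed amount of pixel motion has accumulated, so fast phenomena are slowed down and slow phenomena are sped up. It assumes OpenCV and NumPy; the function names (`frame_motion`, `normalize_motion`) and the `target_speed` parameter are hypothetical, and the paper's spatially varying framerate and user constraints are not modeled here.

```python
# Simplified, global (temporal-only) motion normalization sketch.
# NOT the paper's method: no spatial variation, no user constraints.
import cv2
import numpy as np


def frame_motion(prev_gray, gray):
    """Median optical-flow magnitude between two grayscale frames (in pixels)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    return float(np.median(magnitude))


def normalize_motion(video_path, target_speed=2.0):
    """Return input frame indices spaced by ~`target_speed` pixels of motion."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    selected = [0]      # always keep the first frame
    accumulated = 0.0   # pixel motion since the last selected frame
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        index += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        accumulated += frame_motion(prev_gray, gray)
        prev_gray = gray
        # Emit a frame once enough motion has built up: fast motion keeps
        # frames densely (playback slows), slow motion skips frames
        # (playback speeds up), equalizing apparent pixel-space speed.
        if accumulated >= target_speed:
            selected.append(index)
            accumulated = 0.0
    cap.release()
    return selected
```

Selecting frames at equal increments of accumulated motion is the key design choice here: the output framerate adapts to the content rather than being fixed, which is the essence of the normalization described in the abstract, applied globally per frame rather than per region.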