Collision-Free Video Synopsis Incorporating Object Speed and Size Changes
Published in: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - (2019), 25 Sept. |
---|---|
Main Author: | |
Other Authors: | , , , , |
Format: | Online Article |
Language: | English |
Published: | 2019 |
Access to the Parent Work: | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society |
Subjects: | Journal Article |
Abstract: | This paper presents a new surveillance video synopsis method which performs much better than previous approaches in terms of both compression ratio and artifacts. Previously, a surveillance video was usually compressed by shifting the moving objects of that video forward along the time axis, which inevitably yielded serious collision and chronological disorder artifacts between the shifted objects. The main observation of this paper is that these artifacts can be alleviated by changing the speed or size of the objects, since with varied speed and size the objects can move more flexibly to avoid collision points or to keep chronological relationships. Based on this observation, we propose a video synopsis method that performs object shifting, speed changing, and size scaling simultaneously. We show how to integrate the three heterogeneous operations into a single optimization framework and achieve high-quality synopsis results. Unlike previous approaches that usually use alternative optimization strategies to solve synopsis optimizations, we develop a Metropolis sampling algorithm to find the solution for our three-variable optimization problem. A variety of experiments demonstrate the effectiveness of our method. |
---|---|
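The abstract's key algorithmic idea, Metropolis sampling over a per-object assignment of (time shift, speed factor, size scale), can be illustrated with a minimal sketch. The `energy` function below is a hypothetical stand-in, not the paper's actual cost terms (collision, chronology, and activity-loss penalties); only the acceptance rule follows the standard Metropolis scheme.

```python
import math
import random

def energy(shift, speed, scale):
    # Placeholder energy with a known minimum at (3.0, 1.0, 1.0);
    # the paper's real energy combines collision, chronological-order,
    # and visual-distortion costs, which are not reproduced here.
    return (shift - 3.0) ** 2 + (speed - 1.0) ** 2 + (scale - 1.0) ** 2

def metropolis(steps=20000, temp=0.5, seed=0):
    rng = random.Random(seed)
    state = (0.0, 1.0, 1.0)  # initial (shift, speed, scale)
    e = energy(*state)
    best, best_e = state, e
    for _ in range(steps):
        # Propose a small Gaussian perturbation of one of the three variables.
        i = rng.randrange(3)
        cand = list(state)
        cand[i] += rng.gauss(0.0, 0.1)
        cand = tuple(cand)
        ce = energy(*cand)
        # Metropolis acceptance: always take downhill moves, and take
        # uphill moves with probability exp(-dE / T).
        if ce <= e or rng.random() < math.exp(-(ce - e) / temp):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
    return best, best_e

best, best_e = metropolis()
print(best, best_e)
```

In the paper's setting each object would carry its own (shift, speed, scale) triple and the proposal would perturb one object at a time, but the acceptance logic is the same.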
Beschreibung: | Date Revised 27.02.2024 published: Print-Electronic Citation Status Publisher |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2019.2942543 |