Abstract. Video search results and suggested videos on web sites are represented by a video thumbnail, which is manually selected by the video uploader from among three randomly generated ones (e.g., on YouTube). In contrast, we present a grounded, user-based approach for automatically detecting interesting key-frames within a video through aggregated users’ replay interactions with the video player. Previous research has focused on content-based systems, which have the benefit of analyzing a video without user interactions, but they are monolithic: the resulting video thumbnails are the same regardless of user preferences. We constructed a user interest function based on aggregate video replays and analyzed hundreds of user interactions. We found that the local maxima of the replaying activity correspond to the semantics of information-rich videos, such as lectures and how-tos. The concept of user-based key-frame detection could be applied to any video on the web in order to generate user-based, dynamic video thumbnails in search results.
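The abstract's core idea, a user interest function built from aggregated replay events whose local maxima mark candidate key-frames, can be sketched as follows. This is a minimal illustration, not the paper's actual method: the event format, bin size, and peak threshold (`min_count`) are all assumptions made for the example.

```python
from collections import Counter

def user_interest(replay_events, duration, bin_size=1):
    """Aggregate replay timestamps (seconds re-watched) into a
    per-bin interest histogram over the video timeline."""
    bins = Counter()
    for t in replay_events:
        if 0 <= t < duration:
            bins[int(t // bin_size)] += 1
    n = int(duration // bin_size) + 1
    return [bins.get(i, 0) for i in range(n)]

def local_maxima(interest, min_count=2):
    """Return bin indices whose replay count exceeds both neighbours:
    candidate key-frame positions for thumbnail selection."""
    peaks = []
    for i in range(1, len(interest) - 1):
        if (interest[i] >= min_count
                and interest[i] > interest[i - 1]
                and interest[i] > interest[i + 1]):
            peaks.append(i)
    return peaks

# Hypothetical replay timestamps (seconds) aggregated across many viewers
replays = [12, 12, 13, 12, 45, 45, 46, 45, 45, 80]
interest = user_interest(replays, duration=100)
print(local_maxima(interest))  # → [12, 45]
```

In a real system, the histogram would typically be smoothed before peak detection so that adjacent replay bursts are merged into a single key-frame candidate.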