With just one week left until the submission deadline for our DATE theme project, things have been getting really frantic for me and my fellow AI Empowered Design members. Frantic enough, in fact, that I've only gotten around to posting this now (I'd originally intended to post on Sunday or Monday).

While I'd previously expressed concerns about the potential lack of substance in our project (in that we were focusing only on the ideas, not the execution), there was still plenty of actual legwork to be done. Even though we are no longer developing a fully functional application for our live video abstraction project, we still needed a way to demonstrate our project's methodology to our audience. I've therefore spent the past week putting together a user interface to represent this.

The main focus of this iteration of the UI is a series of animations demonstrating how our (hypothetical) application would construct an abstracted version of a live video broadcast. Roughly, a number of metrics would be sampled from the full-length broadcast and then combined into a final score, which would be used to determine which portions of the broadcast are noteworthy enough to be included in the abstracted version. The sketch below gives a rough idea of what I mean.
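To make that a bit more concrete, here is a minimal sketch of the scoring idea, not our actual implementation: the metric names, weights, and the greedy selection strategy are purely illustrative stand-ins for whatever a real system would use.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: float          # seconds into the broadcast
    end: float
    audio_energy: float   # hypothetical sampled metrics, normalised to 0..1
    motion: float
    chat_activity: float

def combined_score(seg: Segment, weights=(0.4, 0.3, 0.3)) -> float:
    """Combine the per-segment metrics into a single score (weighted sum)."""
    w_audio, w_motion, w_chat = weights
    return (w_audio * seg.audio_energy
            + w_motion * seg.motion
            + w_chat * seg.chat_activity)

def select_highlights(segments, budget_seconds: float):
    """Greedily keep the highest-scoring segments until the time budget is spent."""
    ranked = sorted(segments, key=combined_score, reverse=True)
    kept, used = [], 0.0
    for seg in ranked:
        length = seg.end - seg.start
        if used + length <= budget_seconds:
            kept.append(seg)
            used += length
    # Re-order chronologically so the abstract plays back in broadcast order.
    return sorted(kept, key=lambda s: s.start)
```

The animations in the UI essentially walk the viewer through this pipeline step by step: the raw metrics, the combined score, and the segments that make the cut.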

We settled on Pygame to build the UI, owing to our existing experience with Python and the GUI functionality it offers out of the box. It probably isn't what most people would reach for when building a user interface, but it turns out that the tools needed for writing a video game also cover what's required for a simple UI and a series of animations; a stripped-down version of the kind of rendering loop involved is shown below.
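As an illustration only (the real interface is more involved), a basic Pygame loop like the following is enough to animate a row of bars standing in for the sampled metrics; the bar values here are just placeholder data.

```python
import math
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 240))
pygame.display.set_caption("Abstraction demo (sketch)")
clock = pygame.time.Clock()

t = 0.0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    screen.fill((30, 30, 30))
    # Animate a row of bars, each standing in for one sampled metric.
    for i in range(10):
        height = 50 + 40 * math.sin(t + i * 0.6)
        pygame.draw.rect(screen, (80, 180, 250),
                         (40 + i * 56, 200 - height, 40, height))
    pygame.display.flip()

    t += 0.05
    clock.tick(60)   # cap the animation at 60 frames per second

pygame.quit()
```

Once you have a loop like this, a "UI" is really just a matter of drawing shapes and text in response to the current state, which is exactly what a game engine is built to do.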

There's still plenty more work to be done, though; namely, adding more interactivity and customization to the application.
