Week 10 is finally here. Not only is this a short break for us, but it is also a good time to catch up on the main project. Since most of our time over the past few weeks was spent on the Origami Robots and Design Fiction, we didn’t really have a good understanding of the scope of our theme. As most of us are new to machine learning, and neural networks in particular, we were still pretty clueless about the theme project. Hence, during this much-needed break, we each had to spend time reading up and researching the topic.
Basically, everyone went their own way during the week. Some chose to take a well-rested break and stayed in Hangzhou, while others took the opportunity to travel around the country. As for a lazy person like me, I decided that staying indoors would be the best way to avoid the summer heat. Thus, throughout the week, my time was spent reading articles and research papers about neural networks. Having never dealt with EEG and image data at the same time, we didn’t even have the slightest clue how to design the architecture of the neural network, which API to use, how to preprocess the data, and so on. It felt like starting on a blank sheet of paper, not knowing how to proceed with the given information.
Luckily, our TA guided us by suggesting certain neural network models to consider, and also prompted us to start finding ways to preprocess the data. This gave us a starting point and a direction to work towards. We were told to look at Recurrent Neural Networks, in particular LSTMs, a type of neural network that is able to ‘remember’ time-based inputs. Hopefully, we will be able to come up with a suitable model before the week ends.
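To get a feel for how an LSTM ‘remembers’, here is a minimal sketch of a single LSTM step in pure NumPy. The shapes (a 4-channel input, hypothetically standing in for EEG channels, and an 8-unit hidden state) and the gate ordering are illustrative assumptions, not our actual model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input at this step (n_in,); h_prev, c_prev: previous hidden and
    cell state (n_hidden,). W has shape (4*n_hidden, n_in + n_hidden)
    and stacks the input, forget, output, and candidate blocks.
    """
    n_hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * n_hidden:1 * n_hidden])  # input gate
    f = sigmoid(z[1 * n_hidden:2 * n_hidden])  # forget gate
    o = sigmoid(z[2 * n_hidden:3 * n_hidden])  # output gate
    g = np.tanh(z[3 * n_hidden:4 * n_hidden])  # candidate cell values
    c = f * c_prev + i * g   # cell state: where the 'memory' lives
    h = o * np.tanh(c)       # hidden state passed to the next step
    return h, c

# Run a toy sequence (10 time steps, 4 channels) through the cell.
rng = np.random.default_rng(0)
n_in, n_hidden, T = 4, 8, 10
W = rng.normal(0, 0.1, size=(4 * n_hidden, n_in + n_hidden))
b = np.zeros(4 * n_hidden)
h = np.zeros(n_hidden)
c = np.zeros(n_hidden)
for t in range(T):
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(h.shape)  # (8,)
```

The key idea is the cell state `c`: the forget gate decides how much of the old state to keep, so information from earlier time steps can persist across the sequence, which is exactly why LSTMs are suggested for time-based signals like EEG.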