It was a beautiful Sydney winter’s day to watch the Swans play Carlton in the last round of the regular season on Saturday. This match was my first big data-collection outing, and from here I’ll begin processing audio and video recordings of the crowd, combined with some broader contextual information I’ll be gathering.
One key thing I need to do is establish a classification system for crowd activity – not only volume and intensity, but also sentiment and location. One thing you notice at the AFL specifically is how localized crowd noise can be, due to the enormous distance between one end of the oval and the other. The term ‘crowd’ tends to simplify and homogenize what is, in fact, a very diverse and dynamic system of individuals. It can be quite difficult to see what’s happening at the opposite end of the field, and the players only directly interact with the crowd towards the boundary. Using player GPS data and location-specific audio recordings should provide a spatial account of this interaction.
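As a first pass, a single crowd-noise sample might be tagged along those four axes. The sketch below is a hypothetical schema, not a final taxonomy – the field names, intensity thresholds, and sentiment labels are all placeholder assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class CrowdSample:
    """One tagged crowd-noise observation (hypothetical schema)."""
    timestamp_s: float   # seconds into the match recording
    stand: str           # where the microphone sits, e.g. "northern end"
    volume_db: float     # measured sound level
    sentiment: str       # e.g. "cheer", "groan", "boo", "neutral"

def intensity_band(volume_db: float) -> str:
    """Bucket raw volume into coarse intensity bands.

    The thresholds here are guesses, not calibrated values.
    """
    if volume_db < 60:
        return "murmur"
    if volume_db < 80:
        return "engaged"
    return "roar"

sample = CrowdSample(timestamp_s=1830.0, stand="northern end",
                     volume_db=85.5, sentiment="cheer")
print(intensity_band(sample.volume_db))  # roar
```

Keeping the location field separate from volume and sentiment is what lets samples from opposite ends of the oval be compared, rather than collapsed into one homogeneous ‘crowd’ signal.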
So far, I’ve been immersing myself in Sports Science literature and techniques, learning how player performance is tracked, measured and interpreted.
Next week, I’ll be using a technique called stereophotogrammetry to create a 3D model of the SCG, using an array of six GoPro HERO cameras in a rig called the Omni.
By taking a few thousand still images from the grounds of the SCG, I’ll use photogrammetry software to stitch those images into a 3D model of the stadium.
From here, I’ll be able to use the architecture of the stadium as a scaffold for some interesting immersive experiments with the GPS performance data I’ve been able to access from AFL players.
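One simple way the stadium geometry could scaffold the GPS data is by mapping each player position to the nearest stand, so a GPS trace can be paired with the location-specific audio recorded there. The sketch below is a rough assumption-laden illustration: it models the playing surface as an ellipse with guessed dimensions, and the sector names are placeholders, not the SCG’s actual stands.

```python
import math

# Assumed semi-axes of the oval in metres (half the long/short axis).
# These are rough guesses for illustration, not surveyed SCG dimensions.
SEMI_A, SEMI_B = 77.5, 68.0

def nearest_sector(x: float, y: float) -> str:
    """Assign a ground position (metres from centre) to a quadrant sector."""
    angle = math.degrees(math.atan2(y, x)) % 360  # 0 degrees = east
    if 45 <= angle < 135:
        return "northern end"
    if 135 <= angle < 225:
        return "western side"
    if 225 <= angle < 315:
        return "southern end"
    return "eastern side"

def distance_to_boundary(x: float, y: float) -> float:
    """Approximate distance from a point to the elliptical boundary.

    Uses a radial approximation, which is fine for a coarse sketch.
    """
    theta = math.atan2(y, x)
    boundary_r = (SEMI_A * SEMI_B) / math.hypot(
        SEMI_B * math.cos(theta), SEMI_A * math.sin(theta))
    return boundary_r - math.hypot(x, y)

# A player 60 m towards the northern end sits about 8 m from the boundary,
# close enough to interact directly with the crowd in that stand.
print(nearest_sector(0.0, 60.0))  # northern end
```

The boundary-distance value matters because, as noted above, players only directly interact with the crowd near the boundary – so positions could be filtered by that distance before being linked to crowd audio.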
After this, I’ll start collecting data about sporting crowds, including audio, video and other forms of measurement, before thinking about ways to combine and cross-reference this data with player performance metrics in an immersive environment.