Twitterverse was a data visualisation concept that used keywords in tweets to generate a low-poly universe for a user to explore. Early concepts included dynamically created lists and options; for the prototype, these were replaced with pre-generated terms. Ideally, the results would have been geolocated, giving the installation some whimsy as you browsed tweets about a festival or location you were attending.
Both methods would be filtered to return only tweets that include #hashtags, as these play a role in universe generation. The Twitter Search REST API returns a maximum of 100 tweets per request, so if this range is too sparse to generate interesting universes, multiple queries may need to be configured.
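The multi-query approach could be sketched as follows. This is a hypothetical Python illustration, not the Max implementation: `fetch_page` stands in for the real HTTP request, and paging steps the Search API's `max_id` parameter below the oldest id seen so each request returns strictly older tweets. The hashtag filter keeps only tweets whose entities include at least one hashtag.

```python
def has_hashtag(tweet):
    """Keep only tweets that carry at least one #hashtag (assumed v1.1 entity shape)."""
    return bool(tweet.get("entities", {}).get("hashtags"))

def collect_tweets(fetch_page, pages=3, per_page=100):
    """Issue several paged queries, stepping max_id below the oldest id seen.

    fetch_page is a stand-in for the real Twitter Search API call; per_page
    reflects the API's 100-tweet cap per request.
    """
    tweets, max_id = [], None
    for _ in range(pages):
        batch = fetch_page(count=per_page, max_id=max_id)
        if not batch:
            break
        tweets.extend(t for t in batch if has_hashtag(t))
        # Tweet ids are returned newest-first; ask for strictly older tweets next.
        max_id = min(t["id"] for t in batch) - 1
    return tweets
```

Three pages at 100 tweets each would give the universe generator up to 300 candidate tweets rather than 100, with no overlap between pages.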
To allow a visualisation that grows and expands over time, the installation could be set up to collect new tweets at regular intervals via Max. As long as Max is running, it could poll the server for new tweets at a rate that avoids Twitter's API rate limit restrictions (polling no more often than every 3 minutes should prevent rate limiting). The API's since_id/max_id and until parameters could also be used to avoid retrieving the same tweet multiple times.
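The polling step could look something like this sketch (hypothetical Python, standing in for the logic Max would drive; `poll_fn` is a placeholder for the real search request). Each cycle asks only for tweets newer than the highest id already processed, via the Search API's `since_id` parameter, so repeated polls never return duplicates.

```python
POLL_INTERVAL_SECONDS = 180  # a 3-minute floor keeps well under the rate limit

class TweetPoller:
    """Tracks a since_id cursor so each poll returns only unseen tweets."""

    def __init__(self, poll_fn):
        self.poll_fn = poll_fn   # placeholder for the real search request
        self.since_id = 0        # highest tweet id processed so far

    def poll_once(self):
        """Fetch tweets newer than since_id and advance the cursor."""
        batch = self.poll_fn(since_id=self.since_id)
        if batch:
            self.since_id = max(t["id"] for t in batch)
        return batch
```

Because the cursor only ever moves forward, the visualisation can keep accreting new tweets into the universe for as long as the installation runs.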
With its links to real-world data (even if used rather abstractly), the project is well positioned to be used in conjunction with events, conferences, and festivals, where it could display a Twitterverse based around that event via a projector. It could also be established as a stand-alone installation, potentially with multiple instances running side by side to present different Twitterverses based on different topics or locations.
The entire project was built in Max 7, using an Xbox Kinect camera for motion capture to power navigation. The Twitterverse project is functional; however, it needs more development and refinement to reach its full potential.