Sharing Game Data
At the end of my previous dev log, I alluded to exporting game state via an internal server that Wes wrote called the “Crawler Server.” The original intended use of the Crawler Server was to let us write internal web tools to inspect game data, like a character or a landmark, instead of having to log in to a cluster and use GM commands in the chat window. However, we soon realized we could use this functionality to export data to the public as well. Specifically, I was interested in exporting the port state (unrest levels, port resources, etc.) and the server victory state (the scoreboard displayed in the server victory dialog in game).
You might wonder why we can’t just write web tools that read the game database directly. The primary reason is that even though the game data is stored in an MS SQL database that a web app could theoretically query, all of the data is in encoded binary blobs rather than one column per field (think of an Excel spreadsheet with one player’s character information per row and character stats like ‘level’ and ‘xp’ in each column). We use blobs because it’s faster to move game data in and out of the DB if it’s in one large binary chunk. We also have game code that intelligently upgrades the binary blobs as fields are added and removed. For example, we may add a new character stat and thus need to upgrade existing characters in the DB who were created before the new stat ever existed. There is also the issue of maintaining the integrity of the game data. Web tool development is usually faster and looser, and thus more prone to corrupting data by accident. It would be a bummer if we corrupted a character due to a bug in one of our web tools.
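To make the blob-upgrade idea concrete, here’s a minimal sketch in Python. The binary layout, version numbers, and field names are all invented for illustration; our actual format is different (and not public). The core idea is simply that each blob carries a version number, and older blobs get their missing fields defaulted on load.

```python
import struct

# Hypothetical blob versioning sketch -- layout and stats are made up.
# Each blob starts with a version number; on load, older blobs are
# migrated forward so they match the current schema.

CURRENT_VERSION = 2

def pack_character(level: int, xp: int) -> bytes:
    # v2 layout: version, level, xp (all unsigned 32-bit little-endian)
    return struct.pack("<III", CURRENT_VERSION, level, xp)

def unpack_character(blob: bytes) -> dict:
    version = struct.unpack_from("<I", blob)[0]
    if version == 1:
        # v1 blobs predate the 'xp' stat: default it to 0 on upgrade.
        (level,) = struct.unpack_from("<I", blob, 4)
        return {"level": level, "xp": 0}
    level, xp = struct.unpack_from("<II", blob, 4)
    return {"level": level, "xp": xp}
```

A version-1 blob packed before ‘xp’ existed still loads cleanly, with the new stat defaulted, so nothing in the DB has to be rewritten up front.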
Enter the “Crawler Server.” Its job is to walk through the game database and save its data to the crawler database in an expanded, spreadsheet-style form that a web tool can easily read. It does this by “crawling” through the game data on a regular interval (once every few minutes), looking for any game objects that have changed. Each blob has a last-updated timestamp and a last-crawled time that let us determine whether we need to crawl that object again. If a blob has changed, we parse it and write the individual fields out to the crawler DB, which all of our web tools can access.
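Here’s a rough sketch of that crawl pass in Python. The class, the “key=value” blob encoding, and the `write_row` callback are all stand-ins for illustration; the real Crawler Server looks nothing like this, but the timestamp comparison is the core idea.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative sketch of the crawl pass; names and schema are invented.

@dataclass
class GameBlob:
    object_id: int
    last_updated: datetime   # set by the game when the object changes
    last_crawled: datetime   # set by the crawler on each visit
    data: bytes

def parse_blob(blob: bytes) -> dict:
    # Stand-in for the real binary decoder: pretend the blob is
    # "key=value;key=value" text.
    fields = {}
    for pair in blob.decode().split(";"):
        key, _, value = pair.partition("=")
        fields[key] = value
    return fields

def crawl_pass(blobs, write_row, now=None):
    """Visit each blob; re-crawl only those updated since the last pass."""
    now = now or datetime.now()
    crawled = 0
    for blob in blobs:
        if blob.last_updated <= blob.last_crawled:
            continue  # unchanged since last crawl; skip it
        # Expand the blob into one-column-per-field rows for the crawler DB.
        write_row(blob.object_id, parse_blob(blob.data))
        blob.last_crawled = now
        crawled += 1
    return crawled
```

On each interval only the objects whose last-updated timestamp is newer than their last-crawled time get re-parsed, which keeps the pass cheap even on a big database.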
The next step is actually getting the game data exposed to the general public. The easiest way to do this would be to let people connect remotely to the crawler database and run queries against it. However, that is a bad idea for several reasons. First of all, there are the obvious security concerns: IT would have to make sure that the crawler DB was isolated from the rest of our game cluster and that the DB accounts used to access the data were set up correctly. Our poor overworked IT department really doesn’t have time for this. The second problem is caching query results. Many users of the data are going to run the same kinds of queries, so it’s more efficient to cache the data people want in some conveniently fetchable form. Finally, there is the issue of data filtering. We devs want lots of data from the crawler, but there are fields we might not want the general public to have, or that just aren’t very interesting or accessible to the general public in their current form.
All of the aforementioned reasons motivate a data export path from the crawler database. The current solution we have in place is a set of ASPX pages on our internal tools server that generate XML chunks containing the data we want to export. We currently have three such ASPX pages. The first generates an XML chunk listing the game clusters that players can log into, their version number, and, most importantly, their status (online/offline/admin). With that information, people can build a PotBS status web page to see which clusters are online without having to open a game client (i.e. “Rackham is back up after maintenance!”). The second XML chunk is a list of all the ports in the game with PvP information (unrest level, unrest state, etc.), world location, economic resources, and so on. This info can be used to make a real-time world map on a web page that shows where PvP is going on in the world and where to look to set up economic structures. Finally, the last XML chunk gives a breakdown of the current server victory status. All of the same info that’s available in the server victory status window in game is available from this XML chunk. We have an API write-up that describes the format of these XML chunks and how to get access to them: Data Service API (PDF), but I’ll give the highlights here.
Export data is retrieved from a set of URLs at http://data.burningsea.com via HTTP POST. As a parameter to these URLs you must pass in a “developer key,” which you can retrieve at this page. We use these developer keys to limit the rate at which a given user retrieves data from the data export website, to keep it from getting overloaded (max retrieval rates are posted in the API doc). These limits mean you have to cache the results on your end. For example, if you are planning to create a society website with a map of port unrest for your server, you’ll want a server-side script that grabs the landmark XML chunk from our data site if and only if your local copy of the chunk is older than 10 minutes (or you haven’t grabbed the data before).
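Here’s roughly what such a caching fetch might look like, sketched in Python. The endpoint path (`/landmarks`) and the parameter name (`devkey`) are placeholders I made up for this example; check the API doc for the real URLs, parameter names, and rate limits.

```python
import os
import time
import urllib.parse
import urllib.request

# Sketch of client-side caching for the landmark XML chunk.
# The endpoint path and "devkey" parameter name below are assumptions.

CACHE_FILE = "landmarks.xml"
MAX_AGE_SECONDS = 10 * 60  # re-fetch no more than once every 10 minutes

def get_landmark_xml(developer_key: str) -> str:
    """Return the landmark XML, hitting the data site only when the
    local copy is missing or older than MAX_AGE_SECONDS."""
    if os.path.exists(CACHE_FILE):
        age = time.time() - os.path.getmtime(CACHE_FILE)
        if age < MAX_AGE_SECONDS:
            with open(CACHE_FILE, "r", encoding="utf-8") as f:
                return f.read()  # cache is fresh; don't hit the server

    # Cache is missing or stale: POST the developer key and refresh it.
    body = urllib.parse.urlencode({"devkey": developer_key}).encode()
    with urllib.request.urlopen("http://data.burningsea.com/landmarks", body) as resp:
        xml = resp.read().decode("utf-8")
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        f.write(xml)
    return xml
```

Your society website’s page handler would then read from `get_landmark_xml()` on every hit, and at most one request every ten minutes would actually reach our servers.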
Joe has called game data export “User Content for Programmers,” because it allows technically minded players to participate in the content creation process in new and exciting ways. In the coming months we have plans to export more game data in this manner, in several stages. Many have been clamoring for us to export static and dynamic economic information: static information being things like cargo types, consumables, skills, recipes, and European traders; dynamic information being things like auction listings and the volume of money in the economy. The final step will be exporting character- and society-specific information so that you can look at character stats and society info online. Of course, the more we expose about smaller groups of players, the more careful we have to be about privacy and gameplay issues introduced by exposing such data. The developer key I mentioned earlier will help, in that it will eventually be used to let players regulate who has access to data about their characters (by default, no one but themselves). However, economic and character data export is still in the early planning phases at this point. As such, I encourage you to discuss and debate the possibilities with me in the “In Concept” section of our forums. I look forward to hearing your thoughts…