If you are in the news business, you are likely thinking about location-based services and geo-content in general.
In my role as a developer at Digital First Media, it is an especially important topic as we try to build centralized sites and services that can work across hundreds of local news sites.
It makes sense for the business. Location-based marketing is an area where we need to be leaders. We always have been, and the fast-changing, still-emerging landscape in both mobile and personalization means there is still plenty of room to carve out our share.
This is particularly important to organizations like ours with legacy print products. Zoned inserts are still a big part of the newspaper business, but a scary one too, because advances in location-based targeting online mean significant disruption is ahead. When the money moves online, we need solutions in place to fulfill the needs of our local customers.
Now, it’s not that I suddenly drank some Kool-Aid being served at the sales department open house. I don’t get invited to those anymore. I’m mentioning this as a developer because there is a truly awesome part to all of this.
This location-based personalization is also often exactly what our users want. It is a rare opportunity in this industry when the stars align so that editorial, technology, sales AND (most importantly) the site users all want the same thing.
I’m so bullish on this that I’d say if the industry can execute well on location-based services online, it could revive a lot of companies. But I think we need to be smart about it. No one department can be “product owner” here. We need to build things that users love. The rest will fall in line.
Thanks to Superstorm Sandy, I had some downtime to test a few technologies. During the election, using some reverse-geocoding tools available from OpenStreetMap, I was able to aggregate tweets mentioning either “Romney” or “Obama” and put them into buckets based on their locations.
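The bucketing part is straightforward once each tweet has been mapped to a place. Here is a minimal sketch of the idea: it builds a request URL for Nominatim, OpenStreetMap’s standard reverse-geocoding service, and tallies candidate mentions per place. The `bucket_tweets` helper and its `(text, place)` input shape are my own illustration, not the original prototype code.

```python
from collections import Counter
from urllib.parse import urlencode

# Nominatim is the OpenStreetMap project's geocoding service.
NOMINATIM_REVERSE = "https://nominatim.openstreetmap.org/reverse"

def reverse_geocode_url(lat, lon):
    """Build the request URL that maps a tweet's coordinates to a place."""
    return NOMINATIM_REVERSE + "?" + urlencode(
        {"format": "json", "lat": lat, "lon": lon, "zoom": 5}
    )

def bucket_tweets(tweets):
    """Count mentions of each term per location bucket.

    `tweets` is a list of (text, place) pairs; in a real pipeline the
    `place` string would come from the Nominatim response.
    """
    buckets = Counter()
    for text, place in tweets:
        lowered = text.lower()
        for term in ("romney", "obama"):
            if term in lowered:
                buckets[(place, term)] += 1
    return buckets
```

In practice you would fetch `reverse_geocode_url(...)` for each geotagged tweet (respecting Nominatim’s usage policy) and feed the resulting place names into `bucket_tweets`.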
Earlier this year, I had done some prototypes using the AlchemyAPI, in conjunction with a number of experiments for the Citizens’ Agenda project at Jay Rosen’s NYU Studio 20. I grabbed some old code left on the cutting room floor and was able to run sentiment analysis on those aggregated tweets, to try to track the aggregate mood for each term.
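The rollup itself is simple. AlchemyAPI returns a document-level sentiment score per piece of text; all this sketch assumes is a list of `(term, score)` pairs, and `aggregate_sentiment` is a hypothetical helper of my own, not the original experiment code.

```python
from collections import defaultdict

def aggregate_sentiment(scored_tweets):
    """Average sentiment score per term.

    `scored_tweets` is a list of (term, score) pairs, where score is the
    document-level sentiment (e.g. -1.0 to 1.0) from a service like
    AlchemyAPI.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for term, score in scored_tweets:
        totals[term] += score
        counts[term] += 1
    return {term: totals[term] / counts[term] for term in totals}
```

Run over time windows (say, one batch per hour), this gives the per-term mood trend lines described below.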
The results were surprising to me. The analysis returned more positive results for the term “Romney” leading up to election night. Once it became clear Obama was the winner, the term “Obama” rose a bit, as might be expected, though the overall rating for that term stayed negative the whole time. I can think of lots of cool visualizations and uses for this type of data. Maybe an opportunity for an upcoming hackathon.
While it is clear we aren’t predicting elections yet, I see a lot of potential in this space for all kinds of uses in journalism and marketing.
At Digital First Media, I’m working on some server-side geolocation tools, and I also see many of our journalists thinking about maps and location from their end. The tools out there range from expensive and closed to free and open. Personally, I want two things: automation and APIs. Using Fusion Tables and Maps, you can cobble together some cool things. I also want persistent data; it needs to grow over the years and become more valuable as it does.
I’m a big fan of PostgreSQL, but it was time to learn a little MongoDB, which has a dead-simple way to do 2D location searches with its $near query.
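To give a feel for how simple that query is, here is a small helper that builds the $near query document for a 2dsphere-indexed field. The helper and the field name `"loc"` are my own assumptions for illustration; the query shape itself is standard MongoDB syntax, with coordinates ordered [longitude, latitude] and $maxDistance in meters.

```python
def near_query(lng, lat, max_meters, field="loc"):
    """Build a MongoDB $near query for a 2dsphere-indexed GeoJSON field.

    Note the GeoJSON coordinate order: [longitude, latitude].
    """
    return {
        field: {
            "$near": {
                "$geometry": {"type": "Point", "coordinates": [lng, lat]},
                "$maxDistance": max_meters,  # meters, for 2dsphere indexes
            }
        }
    }
```

With pymongo you would create the index once with `collection.create_index([("loc", "2dsphere")])` and then call `collection.find(near_query(-74.0, 40.7, 5000))` to get everything within 5 km.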
Anyhow, I was able to get a small app running pretty quickly that does one thing very simply: it stores GeoJSON content and then returns that content based on whether it is within a certain radius of the coordinates you query for.
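This isn’t the app itself, but the radius check it performs can be sketched in pure Python with the haversine formula, which is roughly what a geo index computes under the hood. The `within_radius` helper and the feature input shape are assumptions for illustration; the input is a list of GeoJSON Point features.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lng1, lat1, lng2, lat2):
    """Great-circle distance in meters between two (lng, lat) points."""
    lng1, lat1, lng2, lat2 = map(radians, (lng1, lat1, lng2, lat2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def within_radius(features, lng, lat, radius_m):
    """Return the GeoJSON Point features within radius_m of (lng, lat)."""
    hits = []
    for feature in features:
        flng, flat = feature["geometry"]["coordinates"]
        if haversine_m(lng, lat, flng, flat) <= radius_m:
            hits.append(feature)
    return hits
```

For example, querying 50 km around Manhattan would match a feature in Newark (about 14 km away) but not one in Boston (about 300 km away).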