Meet the Mapping L.A. Boundaries API
Browse, download and reuse public maps of Southern California
Today we announce the launch of the Mapping L.A. Boundaries API, a new site that allows anyone to quickly browse, download and reuse dozens of different maps covering Southern California.
Visit boundaries.latimes.com and pick any point in the region. You’ll get back a list of areas it falls within, drawn from dozens of public datasets our team has compiled. The list includes U.S. Census tracts, city boundaries, political districts, police and fire jurisdictions, school zones and the L.A. County neighborhoods assembled, in partnership with Times readers, as part of our Mapping L.A. project.
Besides quick lookups and downloads—good when you need to pull a fact for a story or gather the raw materials for a simple graphic—the site is equipped with an API that allows programmers to write code that interacts with the database. For instance, you could submit a list of dozens of addresses and get back the City Council district each one falls within.
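As a rough sketch of what such a lookup might look like, the snippet below builds a point-in-polygon query URL and pulls boundary names out of a JSON response. The endpoint path, the `contains=lat,lng` parameter and the `objects` key follow django-boundaryservice's conventions; treat them as assumptions and check the live site's responses before relying on them.

```python
import json
from urllib.parse import urlencode

# Assumed endpoint layout, modeled on django-boundaryservice's API.
BASE_URL = "http://boundaries.latimes.com/1.0/boundary/"

def lookup_url(lat, lng, boundary_set=None):
    """Build the URL that asks which boundaries contain a point."""
    params = {"contains": f"{lat},{lng}", "format": "json"}
    if boundary_set:
        # Hypothetical slug for a boundary set; actual slugs may differ.
        params["sets"] = boundary_set
    # safe="," keeps the lat,lng pair readable rather than percent-encoded.
    return BASE_URL + "?" + urlencode(params, safe=",")

def district_names(response_text):
    """Pull the boundary names out of a JSON response body."""
    payload = json.loads(response_text)
    return [b["name"] for b in payload.get("objects", [])]

# Example: a point near downtown Los Angeles, limited to one boundary set.
url = lookup_url(34.0522, -118.2437, boundary_set="la-city-council-districts")
```

Looping `lookup_url` over a geocoded address list is all it would take to batch-assign council districts for a story.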
The site is far from finished. We developed it to meet our needs and hope to gradually improve it as time goes on. But if it might be useful to you, we’d love your input.
- Currently, access to the API is anonymous and throttled. Should we develop credentials to register users and provide fuller access?
- Some of our datasets are clipped at the coastline, others are not. Should we create a taxonomy to record how each dataset handles the coast?
- What kind of documentation would be useful?
- What datasets should we add?
- If a git repository holding the live site’s data and code were public, would you be interested in contributing to it?
Much respect due
The site is powered by a modified version of django-boundaryservice, an excellent open-source project developed by the Chicago Tribune’s News Applications Team. We have contributed a series of improvements to the codebase.
The code that outputs the data in GeoJSON, KML and SHP formats currently resides in our fork of the main project, however. We’ve also changed the application to serve most files from a static archive on Amazon S3, which reduces the load on the database, currently hosted on a small server in Amazon’s EC2 system.
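The static-archive idea boils down to pre-building each export once and pointing downloads at a predictable S3 key instead of regenerating files from the database on every request. A minimal sketch, assuming an illustrative bucket name and key layout (both hypothetical, not the site's actual configuration):

```python
# Assumed bucket URL and key scheme for pre-built boundary-set exports.
S3_ARCHIVE = "https://s3.amazonaws.com/boundaries-archive-example"

# File extension for each export format the site offers.
EXTENSIONS = {"geojson": "json", "kml": "kml", "shp": "zip"}

def archive_url(set_slug, fmt):
    """Return the static-archive URL for a boundary set's export.

    The application can redirect download requests here rather than
    querying the database and serializing the geometries each time.
    """
    return f"{S3_ARCHIVE}/{set_slug}/{set_slug}.{EXTENSIONS[fmt]}"
```

Because the keys are deterministic, a nightly job can rebuild the archive and the app never needs to know whether a file changed.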