Welcome to Planet PostGIS

January 28, 2015

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Import Foreign Schema hack with OGR_FDW and reading LibreOffice calc workbooks

PostgreSQL 9.4 and below doesn't support importing a whole set of tables from a foreign server, but PostgreSQL 9.5 will with the upcoming IMPORT FOREIGN SCHEMA. Taking advantage of it will require FDW designers to be aware of the feature and wire up the plumbing in their wrappers; IMPORT FOREIGN SCHEMA support for ogr_fdw is on the feature ticket list for the PostgreSQL 9.5 release. In the meantime, ogr_fdw ships with a to-die-for command-line utility called ogr_fdw_info that generates the table structures for you and will also list all the tables in the foreign data source if you don't give it a specific table name. So with this utility I wrote a little hack that uses the PostgreSQL COPY PROGRAM feature to call out to the ogr_fdw_info command-line tool to figure out the table names, plus some DO magic to create the tables.

Though ogr_fdw is designed to be a spatial foreign data wrapper, it's turning out to be a pretty nice non-spatial FDW as well, especially for reading spreadsheets, which we seem to get a lot of. I'm demonstrating the hack with a LibreOffice/OpenOffice Calc workbook, but it works equally well with Excel workbooks and most any data source that OGR supports.
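For flavor, here is a minimal sketch of the moving parts (the file path, sheet name, and table names are hypothetical, and the ogr_fdw_info flags follow the utility's documented -s/-l usage; the post's full recipe is behind the link below). COPY FROM PROGRAM (PostgreSQL 9.3+, superuser only) captures the DDL that ogr_fdw_info prints, and a DO block executes it:

-- Capture the CREATE SERVER / CREATE FOREIGN TABLE statements the utility prints.
-- Default COPY text format is fine as long as the output contains no tabs.
CREATE TEMP TABLE fdw_ddl (lineno serial, ddl text);
COPY fdw_ddl (ddl) FROM PROGRAM 'ogr_fdw_info -s /fdw_data/budget2015.ods -l Sheet1';

-- Stitch the captured lines back together in order and run them
DO $$
DECLARE
  ddl_sql text;
BEGIN
  SELECT string_agg(ddl, E'\n' ORDER BY lineno) INTO ddl_sql FROM fdw_ddl;
  EXECUTE ddl_sql;
END
$$;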


Continue reading "Import Foreign Schema hack with OGR_FDW and reading LibreOffice calc workbooks"

by Leo Hsu and Regina Obe (nospam@example.com) at January 28, 2015 04:57 AM

January 23, 2015

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Installing PostGIS packaged address_standardizer on Ubuntu

One of the changes coming to you in PostGIS 2.2 is a set of additional extensions. Two that are close to my heart are the address_standardizer (which was a separate project before being folded into PostGIS for the upcoming 2.2) and the SFCGAL extension for doing very advanced 3D work (just an SQL script in older versions, but an extension in 2.2 with new functions added). We needed the address standardizer running on our Ubuntu box, but since PostGIS 2.2 isn't released yet, you can't get it without some compiling. Luckily the steps are fairly trivial if you are already running PostGIS 2.1. In this article, I'll walk through building and installing just the address_standardizer extension from the PostGIS 2.2 code base. Though I'm doing this on Ubuntu, the instructions are pretty much the same on any Linux; just substitute your distribution's package manager.
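Once built and installed, the extension is a one-liner to enable, and usage looks roughly like this (a sketch based on the US sample lexicon/gazetteer/rules tables that the companion address_standardizer_data_us extension provides; adjust the table names to whatever rule set you load):

CREATE EXTENSION address_standardizer;
CREATE EXTENSION address_standardizer_data_us;  -- provides us_lex, us_gaz, us_rules

-- Parse a free-form address into standardized parts
SELECT house_num, name, suftype, city, state, postcode
FROM standardize_address('us_lex', 'us_gaz', 'us_rules',
       '1 Devonshire Place PH 301, Boston, MA 02109');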


Continue reading "Installing PostGIS packaged address_standardizer on Ubuntu"

by Leo Hsu and Regina Obe (nospam@example.com) at January 23, 2015 02:54 AM

January 20, 2015

Safe Software

6 Google Maps Engine Alternatives

Google has announced that they’re ending support for Google Maps Engine. If you’ve used Google Maps Engine to store your vector and raster data, you should be aware that your GME data will disappear on January 29, 2016.

No better time than the present to migrate your geographic data to a replacement Maps Engine solution. But where to go? What technology can give you everything GME did?

  • A place to store your geographic data in a web-based environment.
  • Powerful visuals for inspecting and analyzing your data.
  • Easy collaboration on maps.
  • Real-time capabilities like dynamic rendering, alerts and notifications.

Let’s look at a few of the alternatives we’re aware of, and how you can migrate your data there without losing anything important. Please add your own suggestions in the comments.

You might find that one solution suits your requirements better than the others. I’d suggest testing out as many alternatives as you can before making your own decision about which is best.

Check out my follow-up post, How to Migrate your Google Maps Engine Data, to see how to get your data into whatever system you choose.

1. ArcGIS Online

If you want everything you had in Maps Engine, then ArcGIS Online is probably your solution of choice. It’s made for creating interactive web maps and apps, and comes complete with all the analytical power and authoritative data you’d expect from an Esri product. Like GME, AGOL renders maps dynamically, which means it offers better visualization for data that’s constantly being updated.

The ‘apps’ component is interesting: you can turn any of your feature services into apps, adding a new level of power to your data. AGOL also has no quotas or rate limits, offering more to organizations with high data volumes. They also have ArcGIS Open Data, which, if you’ve been using Google Map Gallery with Maps Engine, is an equivalent public dataset collection and distribution platform.

Getting your data out of Google Maps Engine and into ArcGIS Online is a one-step process using FME. Note you don’t need to have ArcGIS Desktop in order to use ArcGIS Online.

2. CartoDB

CartoDB is also a full solution that would work nicely as a GME replacement. In fact, they’ve just published a blog comparing the two and describing how easy it is to move your data from one to the other.

In CartoDB, the web-based environment and spatially aware database components are handled by PostGIS in the cloud, while the visualization, collaboration, and analysis aspects are presented simply and beautifully in a web interface.

I need to stress the ease-of-use here. Last week I needed CartoDB for a demo video, and was apprehensive about needing to learn how to use it first. I created an account, logged in, and—that was it. I already knew how to use it. The simplicity of importing data, creating a map, and choosing how to represent my data was immediately obvious. Kudos, CartoDB.

CartoDB also offers dynamic rendering and geo-temporal visualizations, giving you real-time maps and the ability to see data as it happens.

Again, you can extract your data from Google Maps Engine and load it into CartoDB in one step using FME.

3. iSpatial

iSpatial is a complete map creation, visualization, storage, collaboration, and analysis solution. From their website: “iSpatial is a web-based collaborative framework that leverages Google Earth and Maps in a flexible, task-based approach to solving complex problems.”

In addition to location-based inspection and analysis, it’s set up for real-time reporting and alerts management.

This technology is based on Google Earth/Maps, so if it’s the visual Google experience you’re after, this could be your solution. It also leverages Postgres/PostGIS in the back end.

And, yep, FME can extract your data from Google Maps Engine and put it into PostGIS in one step too.

4. A combination of Google products

Google, of course, suggests replacing GME with a combination of their own solutions. Google Maps/Earth is obviously great for visualizing and creating maps, but what about the rest of what Maps Engine offered?

My Maps is basically the map creation, visualization, and collaboration aspect of GME (up until a few months ago, it was actually called Google Maps Engine Lite). Storage, then, can be handled by migrating your data to Google Cloud SQL—a MySQL database in the cloud. Great, so now we have the web-based environment and database storage aspects.

Next, to bridge the two. Once you get your data from Google Maps Engine into the cloud-hosted MySQL database, you’ll have to follow their guide to build a spatial app using the Google Maps API. They’ve also provided an example vector data app and self-serve raster app for reference. Building stylized maps then involves a bit of JavaScript programming. Not an impossible migration, but not as straightforward as some of the solutions above, either.

If this is your solution, then guess what? You can extract your data from Google Maps Engine and put it into Google Cloud SQL in one step using FME. Google even published a page describing how.

5. Mapbox

Mapbox is a great way to publish your geographic data. It’s scalable, cheap, and open source. This, however, is more of a “partial” GME replacement, and will need to be combined with other technology if you want to have everything you knew and loved about Maps Engine. For instance, it lacks some of the powerful data management and analysis tools found in the solutions above.

Storage of your map tiles is handled by a SQLite database (which FME also migrates to). On the front end, Mapbox uses TileMill for creating interactive maps—a design environment that leverages the stunning cartographic abilities of Mapnik. Creating web map tiles with FME is a common scenario, and MBTiles is on our radar for an upcoming release (potentially FME 2016).

6. Your own custom integration

In addition to those above, FME supports writing to many other cloud spatial data storage systems. If you’re not afraid of writing a bit of JavaScript, you can pick your favourite storage system (e.g. Amazon RDS or Aurora), your favourite map creation library/framework, and bridge the two using JavaScript.

Regardless of which technology you choose, rest assured you can easily migrate your data without losing anything. Google provided this page to walk you through exporting your stuff, where FME is one of the recommended migration strategies.

Extracting your data from Maps Engine with FME is as simple as dropping down a reader and entering your credentials. From there, you can move it to any of 335+ alternate systems, including those listed above. FME also offers hundreds of data transformation and quality control options if you need to manipulate the content or structure before loading it into another system.

As always, don’t hesitate to contact our most excellent support team if you need help getting your data wherever it needs to go.

If you’ve been using Google Maps Engine for your geographical data, please share: What alternative solution do you plan on using? Can you suggest any others?

See also: How to Migrate your Google Maps Engine Data

The post 6 Google Maps Engine Alternatives appeared first on Safe Software Blog.

by Tiana Warner at January 20, 2015 10:56 PM

Oslandia

Oslandia wishes you a happy new year 2015



Happy New Year 2015!

The whole team at Oslandia joins us in wishing you all the best for 2015, full of professional success and personal joy.


We start this year with renewed dynamism and enthusiasm. Oslandia just turned 5, and this maturity makes us confident in our growth.


We plan to add new team members this year, as well as new skills. Our research and development work on 3D continues and should give rise to new applications. Software components for transportation, water management and simulation are also on our roadmap for 2015.


Our new training catalog, including exciting new topics, should be available soon. We will also organize a set of free workshops in 2015 to present our products and services.


Keep in touch with us; we are looking forward to collaborating with you on your projects. 2015 promises to be a fascinating year!

For any information or enquiry: infos@oslandia.com


See you soon,

Vincent Picavet, Olivier Courtin and the Oslandia team

by Vincent Picavet at January 20, 2015 11:00 AM

January 12, 2015

Boston GIS (Regina Obe, Leo Hsu)

PostGIS funding

A few people have asked how they can contribute to PostGIS work, and sadly PostGIS is very disorganized in this area. Part of the problem is that no one knows whether setting up a mechanism to collect funds would be worthwhile, whose job it would be to do it, how that person would be reimbursed for the accounting work needed, and how the money would be distributed.

That said, I have decided to start on my own to ask for funding for the recurring PostGIS maintenance work that I do and to treat it as consulting income. This is partly to test the waters to see if getting a more formal funding setup is worthwhile and also so I can devote more time to PostGIS maintenance without becoming a pauper. If you are interested in funding PostGIS maintenance work, refer to this page Paragon PostGIS funding.

It is my hope that eventually we'll have a dedicated PostGIS fund managed by the PostGIS PSC that can be used to support the work of package maintainers and subsidize travel expenses of PostGIS developers. The PostGIS developer group is very global, and getting people together or sending them to conferences is expensive. Sadly, many of the developers on the PostGIS team who would love to attend conferences on PostGIS's behalf are not able to do so. I almost want to cry every time I hear this. Perhaps one day the PostGIS development group can afford to be in the same room together.

by Regina Obe (nospam@example.com) at January 12, 2015 09:40 AM

January 03, 2015

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Updated Foreign Data Wrappers for PostgreSQL 9.3 Windows

As stated in the last article, I've packaged FDW binaries for PostgreSQL 9.3 Windows 32-bit and 64-bit and added in the ogr_fdw one. We've tested these with the standard EDB VC++-built PostgreSQL Windows installers, and they work fine with those.

This package is an update of the ones we've distributed before: it adds ogr_fdw and is recompiled with the latest source for www_fdw and file_textarray.


Continue reading "Updated Foreign Data Wrappers for PostgreSQL 9.3 Windows"

by Leo Hsu and Regina Obe (nospam@example.com) at January 03, 2015 10:05 PM

Boston GIS (Regina Obe, Leo Hsu)

Building GDAL under mingw-w64 with native windows ODBC support

When ogr_fdw came out, I was very excited to try it out on Windows. To start with I used the default GDAL that we package with PostGIS, which is built under mingw-w64 (both 32-bit and 64-bit versions). Then I thought: imagine how much more I could get by compiling with more drivers. A big one on my list was ODBC. Sadly this did not work out of the box under mingw-w64.

For Windows folks who want to try out ogr_fdw, I have binaries for PostgreSQL 9.4 FDWs which include ogr_fdw and some of my other favorites. I do have the 9.3 builds as well, but haven't written up an article about them until I test them on my production instances. The links for those are the same as for 9.4; just replace 94 with 93 in the download link. These include ODBC support, which should allow you to query a lot of ODBC data sources.

Normally when I try to compile something that depends on ODBC using mingw, the configure scripts or make files assume you are using UnixODBC (which works, but uses a different ODBC manager and drags in all these extra dependencies). The way I normally fix this to use the native ODBC support is to replace all references to -lodbc in the configure script with -lodbc32. Yes, even the Windows 64-bit ODBC library is called odbc32; it lives in the System32 folder for 64-bit and SysWOW64 for 32-bit. I imagine Microsoft came up with this confusing convention for backward compatibility. So I did that for GDAL, and to my horror it did not work.

What was the problem? GDAL had an extra dependency on the odbcinst library. Although the odbcinst header is packaged with mingw-w64, there is no associated library. I searched through my windows/system32 folder and my mingw-w64 libodbc*.a files and couldn't find such a thing. So I removed that reference from the configure script; GDAL at least got past configure and happily started compiling, and then failed when linking. So I searched for the function it couldn't find and discovered it lives in a library called odbccp32. So to compile with ODBC support, as noted in my ogr_fdw_build gist:

  1. Edit the configure script in the GDAL source tree and replace all references to -lodbc with -lodbc32, and -lodbcinst with -lodbccp32.
  2. Then add --with-odbc=/mingw/${MINGHOST} to your configure call, where MINGHOST is set to x86_64-w64-mingw32 for the 64-bit mingw-w64 chain and i686-w64-mingw32 for the 32-bit chain.

With this I was able to build libgdal with native Windows ODBC support that depends only on the ODBC system DLLs that ship with Windows (so no extra dependencies). I tested it by querying one of my clients' SQL Server databases via ogr_fdw. One issue I ran into is that it didn't handle SQL Server datetime right and kept giving the error ERROR: timestamp(62258384) precision must be between 0 and 6. So I had to change the ogr_fdw_info-generated definition to bring in datetimes as varchar instead of timestamp, and then cast to timestamp as part of my PostgreSQL query. This may be an idiosyncrasy with how lengths have changed in Windows ODBC that I have to patch, or something in ogr_fdw. I haven't tried the MSSQLSpatial driver, but it suddenly showed up as an option after compiling with ODBC support. I also had some issues with UTF encoding, which I was able to work through by stripping high-byte characters in my PostgreSQL query.
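As a rough illustration of that workaround (the server, DSN, table and column names below are all hypothetical), the foreign table declares the datetime column as varchar and the query casts it back:

CREATE SERVER mssql_svr
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource 'ODBC:myuser/mypass@MyDSN',
    format 'ODBC' );

CREATE FOREIGN TABLE mssql_orders (
  fid integer,
  order_date varchar,   -- really a SQL Server datetime; declared varchar to dodge the precision error
  total numeric )
  SERVER mssql_svr
  OPTIONS ( layer 'orders' );

SELECT fid, order_date::timestamp AS order_date, total
FROM mssql_orders
WHERE order_date::timestamp >= '2015-01-01';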

One of my plans for PostGIS 2.2 is to package a GDAL with more drivers; next on my list is the SQLite family. I also need to figure out a way to make the PostGIS-packaged GDAL swappable with the Visual C++-built ones. I can do that with, for example, curl and libxml, but the last time I looked the VC-built GDAL seemed to expose differently named symbols, so it never hooks up. If I can get this going, then people can swap out the GDAL we package with PostGIS for their own to get proprietary drivers like MrSID and so forth. Why don't I just try to build it with VC++? Because then I'd have to ship a VC++ runtime if I don't build with the same one EDB is using, and they use different flavors for each version of PostgreSQL. I also feel uncomfortable around Visual Studio for anything other than web development.

by Regina Obe (nospam@example.com) at January 03, 2015 12:19 AM

December 28, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Foreign Data Wrappers for PostgreSQL 9.4 Windows

As stated in the last article, I've packaged FDW binaries for PostgreSQL 9.4 Windows 32-bit and 64-bit and added in the ogr_fdw one. We've tested these with the standard EDB VS-built PostgreSQL Windows installers, and they work fine with those.


Continue reading "Foreign Data Wrappers for PostgreSQL 9.4 Windows"

by Leo Hsu and Regina Obe (nospam@example.com) at December 28, 2014 11:02 AM

December 27, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

OGR foreign data wrapper on Windows first taste

This Christmas I received something very special from Paul Ramsey and Even Rouault, as detailed in Foreign Data Wrappers for PostGIS. It's something I've been patiently waiting for for 4 years. I think it has a few issues I'm working to replicate, but overall it's much faster than I expected and pretty slick.

So why is ogr_fdw so special? Because GDAL/OGR is an avenue to many data sources, NOT JUST GEOSPATIAL ones. It's the NOT JUST that I am most excited about. Though the focus is geospatial, you can use it with non-geospatial data sources, as we described a long time ago in OGR2OGR for data loading.
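To make the NOT JUST point concrete, here is a rough sketch of a purely tabular foreign table over a spreadsheet (the file path, sheet and column names are made up, and the XLSX driver has to be present in your GDAL build):

CREATE SERVER customers_wb
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource '/fdw_data/customer_list.xlsx',
    format 'XLSX' );

CREATE FOREIGN TABLE customer_list (
  fid integer,
  customer varchar,
  sales numeric )
  SERVER customers_wb
  OPTIONS ( layer 'Sheet1' );

-- No geometry anywhere, just a spreadsheet you can now query and join in SQL
SELECT customer, sum(sales) FROM customer_list GROUP BY customer;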


Continue reading "OGR foreign data wrapper on Windows first taste"

by Leo Hsu and Regina Obe (nospam@example.com) at December 27, 2014 09:26 PM

December 19, 2014

Boundless Geo

OGR FDW FTW!

Merry Christmas, nerds! Have you ever wanted to connect your PostGIS database directly to an existing GIS file or database and read from it without importing the data? I have good news, repeat after me: OGR FDW FTW!

(How did this happen? Boundless provides its engineers with “innovation time” to pursue personal technical projects, and this year I chose to blow all my time in one go on this project. Like the idea of innovation time? Join us!)

OGR is the vector subsystem of the GDAL open source format library. The OGR API lets applications read and write many different formats (Shapefile, Oracle, SDE, MapInfo, PostGIS, FGDB, GML, etc.) without having to import them first.

FDW is the PostgreSQL API for implementing “foreign data wrappers”: virtual tables in the database that are actually connections to remote data files, repositories and servers. There are FDW implementations to connect to MySQL, Oracle, SQLite, and even flat text files!

FTW is “for the win”! Because the OGR API is a simple table-oriented read-write library that is practically begging to be hooked up to the PostgreSQL FDW system to expose OGR data sources as tables in PostgreSQL.

Here’s how it works.

First, go to the source code repository, build and install the ogr_fdw extension.

Next, create the ogr_fdw extension and the postgis extension.

CREATE EXTENSION postgis;
CREATE EXTENSION ogr_fdw;

Now create a “server”. For a database FDW, the server would be an actual database server somewhere. For the OGR FDW, a server is an OGR connection string: it could be a database server, a directory full of files, or (as in this case) a web service:

CREATE SERVER opengeo
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource 'WFS:http://demo.opengeo.org/geoserver/wfs',
    format 'WFS' );

Now create a “foreign table”. This will look just like a table, to the database, but accessing it will actually create an access to the remote server.

CREATE FOREIGN TABLE topp_states (
  fid integer,
  geom geometry,
  gml_id varchar,
  state_name varchar,
  state_fips varchar,
  sub_region varchar,
  state_abbr varchar,
  land_km real,
  water_km real )
  SERVER opengeo
  OPTIONS ( layer 'topp:states' );

Now, treat the table like you would any other PostGIS table, and ask it a question in SQL:

SELECT st_area(geom::geography) 
FROM topp_states 
WHERE state_name = 'Missouri';

And the answer comes back: 180863 sq km.

How does it work? The PostgreSQL query fires off an OGR query to the server (in this case, the OpenGeo Suite demo server) which pulls the table down, and it is then filtered and calculated upon locally in PostgreSQL.

Could it be better? Sure!

It could push SQL restriction clauses down to the OGR driver, reducing the quantity of data returned to the server. For big tables, this will be very important.

It could restrict the number of columns it returns to just the ones needed for the query. This will make things a little faster.

It could allow read/write access to the table, so that INSERT, UPDATE and DELETE queries can also be run. This opens up a whole world of interoperability possibilities: imagine your database being able to directly edit a File Geodatabase on the file system? Or directly edit an ArcSDE server in a workgroup?

The biggest limitation of the PostgreSQL FDW system is that it requires a table definition before it can work, so you require a priori knowledge of the table structure to set up your tables. Because this just creates busywork, I’ve also bundled a utility program with the ogr_fdw extension: ogr_fdw_info. The utility will read an OGR data source and layer and return the SQL you need to create an FDW server and table for reading that layer.

Enjoy wrapping your foreign tables, and enjoy the holidays!

The post OGR FDW FTW! appeared first on Boundless.

by Paul Ramsey at December 19, 2014 05:11 PM

PostGIS Development

Foreign Data Wrappers for PostGIS

The last couple weeks have seen two interesting updates in the world of PostgreSQL “foreign data wrappers” (FDW). Foreign data wrappers allow you to access remote data inside your database, exactly like other tables. PostgreSQL ships with two example implementations, one for accessing text files, and the other for accessing remote PostgreSQL servers.
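For instance, the text-file wrapper that ships with PostgreSQL (file_fdw) turns a CSV file into a queryable table in a few lines (the file path and columns below are hypothetical):

CREATE EXTENSION file_fdw;

CREATE SERVER flat_files FOREIGN DATA WRAPPER file_fdw;

CREATE FOREIGN TABLE parcels_csv (
  parcel_id text,
  owner_name text,
  assessed_value numeric )
  SERVER flat_files
  OPTIONS ( filename '/fdw_data/parcels.csv', format 'csv', header 'true' );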

The two updates of interest to PostGIS users are:

  • The Oracle FDW implementation was recently enhanced to support spatial columns, so an Oracle table with SDO_GEOMETRY columns can be exposed in PostgreSQL as a table with PostGIS geometry columns.
  • A new OGR FDW implementation was released that supports exposing any OGR data source as a table with PostGIS geometry columns.

Now you can access your PostGIS data without even going to the trouble of importing it first!

by Paul Ramsey at December 19, 2014 12:00 AM

December 18, 2014

Oslandia

New version of "Tempus", a framework for multimodal route planning


Oslandia and IFSTTAR are pleased to announce the release of a new version of their multimodal route planning framework: Tempus



Tempus is an Open Source C++/Python framework for using, developing and testing route planning algorithms, with a particular focus on multimodal routing and multi-objective requests, where every possible transport mode is taken into consideration for a trip: private cars, public transport, shared vehicles, bikes, etc.



Tempus features:

  • a C++ core based on Boost::graph for efficient and generic multimodal graph manipulations
  • a (dynamically loadable) plugin architecture for routing algorithms
  • exposure of its API as a webservice through a FastCGI WPS server that is able to respond to simultaneous requests
  • a client for QGIS
  • a loader script that can load road network data from Navteq, TomTom and OpenStreetMap, public transport data from GTFS, and points of interest from CSV or shapefiles
  • a Python request API

Tempus is distributed under the terms of the GNU LGPLv2+.


Tempus is on GitHub; get it, fork it, PR it: https://github.com/ifsttar/Tempus


The focus of Tempus is both on delivering an engine able to run efficient state-of-the-art routing algorithms and on offering a complete development environment to designers of routing algorithms. A complete set of data from different sources can be integrated into a main (PostGIS) database thanks to the loader. The modular architecture allows algorithms to be developed as plugins for the core. Several mechanisms are there to help during development: user variables for plugins, the possibility to output parts of the traversed graph, user metrics, etc.


This 1.2.0 release brings:

  • a routing algorithm able to deal with turning movement restrictions for each transport mode
  • a reverse graph adaptor that allows the graph to be seen reversed, so that requests with an 'arrive before' constraint can be processed
  • a tool (in the QGIS plugin) to select a subset of the graph to work on
  • support for speed profiles on roads
  • an implementation of the A* algorithm

As with previous releases, documentation is available, including notes on compilation and installation.


A Windows installer is also available. It installs Tempus on Windows, as well as (if necessary) QGIS 2.4 and PostGIS 2!


Development is still ongoing to include state-of-the-art algorithms, such as contraction hierarchies or RAPTOR.


Please let us know if you are interested in development, training or consulting around these technologies. Do not hesitate to contact us at infos@oslandia.com.

by Hugo Mercier at December 18, 2014 09:00 AM

PostGIS Development

PostGIS 2.1.5 Released

The 2.1.5 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1: the 2.1.5 release. As befits a patch release, the focus is on bugs, breakages, and performance issues.

http://download.osgeo.org/postgis/source/postgis-2.1.5.tar.gz

Continue Reading by clicking title hyperlink ..

by Paul Ramsey at December 18, 2014 12:00 AM

December 12, 2014

Boston GIS (Regina Obe, Leo Hsu)

AvidGeo LocationTech Tour Boston 2014, Synopsis

This past Monday we gave an introductory workshop on PostGIS. Our slides are here: PostGIS tutorial. Overall the event was very educational. I did have butterflies throughout, since we were the last to speak.

Talk slides

The talks were good, but most of the topics were not new to me, so they were harder to appreciate. The talk I thought was the best was given by Steve Gifford on Geospatial on Mobile. Two things I liked about it: it covered a topic I didn't know much about (developing native 3D globe map apps for iOS and Android), and Steve was a very funny speaker, with the kind of funniness that looks unplanned and natural. I wish it had a bit more Android content in it though. The full list of talks and associated slides is below.

Workshops

The workshops were, I think, the best part of the event. I particularly enjoyed Andrew Hill's CartoDB talk (PostGIS without the pain but with all the fun), mostly because I hadn't used CartoDB before, so it was the first time I'd seen it in action. Being able to drag and drop a zip file from your local computer, or point at a URL of data on some site, and have a table ready to query I thought was pretty cool. You can also schedule it to check for updates if it's a URL to a shapefile zip or something. And of course being able to write raw PostGIS spatial queries and have a map app in 5 minutes was pretty slick.

Ours came after, and unfortunately I think it was pretty dry and too technical for most GIS folks. Plus we hadn't told people to download the files beforehand, so it was next to impossible to follow along. We should have called it PostGIS with the pain and on a shoestring budget.

by Regina Obe (nospam@example.com) at December 12, 2014 10:11 PM

December 06, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Oracle FDW 1.1.0 with SDO_Geometry PostGIS spatial support

Oracle FDW is a foreign data wrapper PostgreSQL extension that allows you to read and write to Oracle database tables from a PostgreSQL database. You can get it via the PGXN network or the main website http://laurenz.github.io/oracle_fdw/.

What is new about the latest 1.1.0 release is support for the Oracle SDO_GEOMETRY type, which lets you map the most common geometry types POINT, LINE, POLYGON, MULTIPOINT, MULTILINE and MULTIPOLYGON to the PostGIS geometry type. Much of the spatial plumbing work was done by Vincent Mora of Oslandia. If we have any Windows Oracle users out there: yes, there are Windows binaries available for PostgreSQL 9.1-9.4, both 32-bit and 64-bit. The FDW does have a dependency on OCI.dll, which I think ships with Oracle products. Unfortunately, we are not Oracle users, so we can't kick the tires.
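A rough sketch of what the setup looks like on the PostgreSQL side (connection details, schema and table names are hypothetical; PostGIS must be installed so the geometry type exists):

CREATE EXTENSION oracle_fdw;

CREATE SERVER oradb
  FOREIGN DATA WRAPPER oracle_fdw
  OPTIONS ( dbserver '//oracle.example.com:1521/ORCL' );

CREATE USER MAPPING FOR CURRENT_USER
  SERVER oradb
  OPTIONS ( user 'scott', password 'tiger' );

CREATE FOREIGN TABLE ora_roads (
  id integer,
  name varchar,
  geom geometry )   -- mapped from the Oracle SDO_GEOMETRY column
  SERVER oradb
  OPTIONS ( schema 'SCOTT', table 'ROADS' );

SELECT name, ST_Length(geom) FROM ora_roads LIMIT 10;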

by Leo Hsu and Regina Obe (nospam@example.com) at December 06, 2014 05:33 PM

December 02, 2014

Paul Ramsey

The Tyranny of Environment

Most users of PostGIS are safely ensconced in the world of Linux, and their build/deploy environments are pretty similar to the ones used by the developers, so any problems they might experience are quickly found and removed early in development.

Some users are on Windows, but they are our most numerous user base, so we at least test that platform preemptively before release and make sure it is as good as we can make it.

And then there's the rest. We've had a passel of FreeBSD bugs lately, and I've found myself doing Solaris builds for customers, and don't get me started on the poor buggers running AIX. One of the annoyances of trying to fix a problem for a "weird platform" user is just getting the platform set up and running in the first place.

So, having recently learned a bit about Vagrant, and seeing that some of the "weird" platforms have boxes already, I thought I would whip off a couple of Vagrant configurations so it's easy in the future to throw up a Solaris or FreeBSD box, or even a quick CentOS box for debugging purposes.

I've just been setting up my Solaris Vagrantfile and using my favourite Solaris crutch: the OpenCSW software repository. But as I use it, I'm not just adding the "things I need", I'm implicitly choosing an environment:

  • my libxml2 is from OpenCSW
  • so is my gcc, which is version 4, not version 3
  • so is my postgres

This is convenient for me, but what are the chances that it'll be the environment used by someone on Solaris having problems? They might be compiling against libraries from /usr/sfw/bin, or using the Solaris gcc-3 package, or any number of other variants. At the end of the day, when testing on such a Solaris environment, will I be testing against a real situation, or a fantasyland of my own making?

For platforms like Ubuntu (apt) or Red Hat (yum) or FreeBSD (port) where there is One True Way to get software, the difficulties are less, but even then there is no easy way to get a "standard environment", or to quickly replicate the combinations of versions a user might have run into that is causing problems (libjson is a current source of pain). DLL hell has never really gone away, it has just found new ways to express itself.

(I will send a psychic wedgie to anyone who says "docker", I'm not kidding.)

by Paul Ramsey (noreply@blogger.com) at December 02, 2014 09:22 PM

November 30, 2014

Archaeogeek (Jo Cook)

PostGIS Day 2014

On 20th November this year we held the first PostGIS “Show-and-Tell” Day at the HQ of the British Computer Society. This was the first “official” OSGeo:UK event since hosting FOSS4G 2013 in Nottingham last year; it took us that long to recover from the stress!

We deliberately chose to make the event extremely light and informal, with the minimum of organisation, and this seemed to work pretty well. We had a range of speakers doing both longer talks and lightning talks, and then a session at the end where we discussed useful tools and add-ons for working with PostGIS.

The only caveats we placed on speakers were that the talk should (obviously) include PostGIS or PostgreSQL in some way, and that overt sales pitches would be considered uncool. Luckily our speakers obliged! There was also some really good audience interaction.

Highlights for me included Foreign Data Wrappers, for getting at information stored in other databases or behind other APIs through PostgreSQL, and raster processing. I was particularly excited by the Twitter FDW, but it turns out that it doesn’t work with the latest version of the Twitter API, and it doesn’t look like a particularly well-maintained piece of code. You can see links to all the talks, and notes from the discussion we had at the end, at our OSGeo:UK GitHub repository for the event: http://osgeouk.github.io/pgday/.

It transpired that there were a number of other events globally, also celebrating #PostGISDay, and perhaps next year it would be good to coordinate these better.

From an OSGeo:UK perspective, we have plans to hold more of these light-touch events, on a range of subjects (ideas gratefully received). Personally I would have liked to charge a nominal fee for attending the event, not to make a profit but so that we could make a small donation back to PostGIS; however, charging money makes the whole process more complicated! We were extremely lucky in this case to get the venue for free, courtesy of the BCS Location Information Group, and to get the food sponsored by Astun Technology.

All in all, a really fun event, both to organise and to attend. Thanks to all the speakers and attendees, and here’s to another event quite soon.

November 30, 2014 08:00 AM

November 27, 2014

Boston GIS (Regina Obe, Leo Hsu)

osm2pgrouting for Windows

For loading OSM data in a format ready for pgRouting queries, the two common tools I've seen used are osm2po and osm2pgrouting. Unfortunately osm2pgrouting has long been a Unix-only tool, mostly because no one had tried to compile or test it on Windows to see if it works. This meant that Windows users who wanted a quick road to pgRouting nirvana had to use osm2po. It turns out the osm2pgrouting code compiles fine on Windows (at least under the mingw-w64 chain) and seems to work fine, so we should start making it available for Windows.

This is still experimental, but if you are a Windows user (or have a pgRouting Windows user friend) and want to kick the tires a bit, you'll find osm2pgrouting for Windows (both 32-bit and 64-bit) binaries on the PostGIS windows download page in the Unreleased section.

I compiled osm2pgrouting (for pgRouting 2.0) against PostgreSQL 9.4 using the same mingw-w64 chain I use to build PostGIS and pgRouting for Windows, clumsily detailed in my gist shell scripts. Though it depends on libpq, it seems to work fine if you use it against the PostgreSQL 9.3 libpq. So you just need to copy the additional files (osm2pgrouting and expat) into your PostgreSQL bin folder and you are good to go loading data into a pgRouting-enabled database. It is assumed you already have PostGIS 2.1+ and pgRouting 2.0 installed, which come bundled together as part of the PostGIS 2.1+ Windows installers and are also available on the PostGIS windows download page.
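Once the data is loaded, a first routing query against the ways table that osm2pgrouting creates looks something like this (a sketch for the pgRouting 2.0 API; the vertex ids are made up and column names can vary a bit between osm2pgrouting versions):

SELECT seq, id1 AS node, id2 AS edge, cost
FROM pgr_dijkstra(
       'SELECT gid AS id, source, target, length AS cost FROM ways',
       1234,    -- source vertex id (hypothetical)
       5678,    -- target vertex id (hypothetical)
       false,   -- directed graph?
       false);  -- use reverse_cost?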

Which should you use: osm2pgrouting or osm2po?

In many ways osm2po is superior to osm2pgrouting and has had more time invested in it. It can handle larger files on smaller machines, can work with the OSM pbf format, and supports both Unix-like and Windows platforms. That said, osm2po is not absolutely superior to osm2pgrouting. osm2po is inferior in a couple of ways that are important to me.

  • It's not open source, it's freeware -- which means you can't kick the code around and learn from it or inspect it. It also means you can't change it and contribute back your changes.
  • The primary motive of osm2po is not to be a loading tool for pgRouting. osm2po is a complete routing engine with a web server that doesn't even rely on PostgreSQL. So if the maintainer decides one day that providing a pgRouting export is no longer beneficial to osm2po's development path, or the pgRouting devs need to tweak it a bit to work with newer pgRouting, it will be difficult.
  • It's written in Java. This is my own personal dislike. Not that anything is wrong with Java; it's just sometimes a bit of a nuisance to get the JVM going on a server I happen to be stuck on where I can't install anything I want. osm2pgrouting is much lighter weight, not requiring a JVM, though it does require PostgreSQL, PostGIS, and pgRouting -- which you should have already.
  • osm2po generates an intermediary SQL file which you then have to load via psql. I prefer the direct-load approach of osm2pgrouting over having a big SQL file to contend with afterward.

That said, osm2po is great in many ways, but alternatives are always nice to have. The only annoying thing about having alternatives is deciding which one is best for what you are doing.

by Regina Obe (nospam@example.com) at November 27, 2014 11:34 PM

November 21, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

PostGIS Day synopsis

Yesterday was PostGIS Day or, as some may call it, Post GIS Day, and a couple of interesting things happened:

  • PostgreSQL 9.4 RC1 came out.
  • There were parties and unconferences, many summarized on http://2014.postgisday.rocks
  • I managed to entertain myself with a Conway's Game of Life, PostGIS raster map algebra style, and pondered how wonderful PostGIS would be if it could generate animated gifs with some sort of aggregate function; to which I was politely called crazy by some of my fellow PSC friends.

But what was greatest of all and took the cake were these pictures:


Continue reading "PostGIS Day synopsis"

by Leo Hsu and Regina Obe (nospam@example.com) at November 21, 2014 09:16 PM

November 20, 2014

Boundless Geo

Happy PostGIS Day!

Yesterday was GIS Day, which means that today is PostGIS Day — get it? Post-GIS Day! To celebrate, we're giving a 50% discount on online PostGIS training through the end of the week! Visit our training catalog and use promo code "postgis2014" to take advantage of this offer.

A lot has happened since last year, when I joined Stephen Mather, Josh Berkus, and Bborie Park in a PostGIS all-star hangout with James Fee.

In case you missed them, here are some features from our blog and elsewhere that highlight what's possible with PostGIS:

And, as always, be sure to check out our workshops for a slew of PostGIS-related courses, including Introduction to PostGIS and our Spatial Database Tips & Tricks.

Interested in learning about or deploying PostGIS? Boundless provides support, training, and maintenance for installations of PostGIS. Contact us to learn more.

The post Happy PostGIS Day! appeared first on Boundless.

by Paul Ramsey at November 20, 2014 01:54 PM

Boston GIS (Regina Obe, Leo Hsu)

PostGIS Day Game of Life celebration

For this year's PostGIS Day, I decided to celebrate with a little Conway's Game of Life fun, inspired by Anita Graser's recent blog series Experiments with Game of Life. The path I chose to simulate the Game of Life is a little different from Anita's. This variant exercises PostGIS 2.1+ raster map algebra and PostgreSQL 9.1+ recursive queries. Although you can do this with PostGIS 2.0, the map algebra syntax I am using is only supported in PostGIS 2.1+. My main disappointment is that, because PostGIS does not yet support direct generation of animated gifs, I had to settle for a comic strip built by unioning frames of rasters instead of a motion picture. Hopefully some day my PostGIS animated gif dream will come true.
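The post's actual queries are behind the link below, but to give a flavor of the PostGIS 2.1 map algebra callback style it relies on (this is an illustrative sketch, not the post's code; life_board is a hypothetical one-band 8BUI raster table where 1 = alive and 0 = dead), one generation could be computed along these lines:

-- Callback: receives the 3x3 neighborhood as value[1][row][col], center at [1][2][2]
CREATE OR REPLACE FUNCTION life_rule(value double precision[], pos integer[], VARIADIC userargs text[])
RETURNS double precision AS
$$
DECLARE
  neighbors integer := 0;
  alive integer := COALESCE(value[1][2][2], 0)::integer;
BEGIN
  FOR i IN 1..3 LOOP
    FOR j IN 1..3 LOOP
      IF NOT (i = 2 AND j = 2) THEN
        neighbors := neighbors + COALESCE(value[1][i][j], 0)::integer;
      END IF;
    END LOOP;
  END LOOP;
  IF (alive = 1 AND neighbors IN (2, 3)) OR (alive = 0 AND neighbors = 3) THEN
    RETURN 1;
  ELSE
    RETURN 0;
  END IF;
END;
$$ LANGUAGE plpgsql IMMUTABLE;

-- One generation: a 3x3 neighborhood map algebra pass (distancex = distancey = 1)
SELECT ST_MapAlgebra(rast, 1,
         'life_rule(double precision[], integer[], text[])'::regprocedure,
         '8BUI', 'FIRST', NULL::raster, 1, 1) AS rast
FROM life_board;

Repeating that pass inside a PostgreSQL recursive CTE is what strings the generations together.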


Continue reading "PostGIS Day Game of Life celebration"

by Regina Obe (nospam@example.com) at November 20, 2014 07:04 AM

November 18, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4GNA 2015 and PGDay San Francisco March 2015

PGDay 2015 San Francisco will be held March 10th, 2015 at the Hyatt San Francisco Airport in Burlingame, CA (just outside of San Francisco). This year PGDay will be hosted alongside the Free and Open Source Geospatial North America (FOSS4GNA) conference 2015, which runs March 9th-12th, and EclipseCon NA 2015. Speaker submissions for FOSS4GNA 2015 and EclipseCon NA 2015 end this Monday, November 17th, 2014.


Continue reading "FOSS4GNA 2015 and PGDay San Francisco March 2015"

by Leo Hsu and Regina Obe (nospam@example.com) at November 18, 2014 03:46 AM

November 03, 2014

Boston GIS (Regina Obe, Leo Hsu)

AvidGeo Conference: LocationTech Tour Boston, Dec 8 2014

We'll be giving a PostGIS Introduction tutorial at the AvidGeo Conference: LocationTech Tour Boston, which will be held 8 December 2014 from 8:30 AM to 4:30 PM at:

Hack / Reduce
275 Third St
Cambridge, MA, 02142

Tickets are $40 for admission.

In addition to our PostGIS intro tutorial, there is a great line-up of talks from visionaries such as Adena Schutzberg, speaking on the State of the Geospatial Industry, and Raj Singh on the topic of The NoSQL Geo Landscape. Also check out the other tutorials on CartoDB and QGIS.

by Regina Obe (nospam@example.com) at November 03, 2014 06:40 AM

October 26, 2014

Stephen Mather

Airspace — A deep rabbit hole

In previous maps we looked at Class B, C, and D airspace. Let’s add in Class E0 and E5… (not yet in 3D):

(Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL)

Map showing Class B, C, D, E0, and E5 airspace


Previous posts:

https://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again/

and

https://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again-reprise/


by smathermather at October 26, 2014 02:16 AM

October 25, 2014

Stephen Mather

Airspace is complicated — and so I abuse PostGIS once again — Reprise…

In the previous post, https://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again/, we explored the 3D shape and complexity of controlled airspace.

Now here’s the rest of the code. We’ll add our affine transformation à la Seth Fitzsimmons:

    SELECT 
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )
    AS geom

And integrate that into our original function:

-- Inputs are a geometry and an elevation to move the geometry to
CREATE OR REPLACE FUNCTION threed_iso(footprint geometry, elevation numeric)
  RETURNS geometry AS
$BODY$

-- Force 3D, then translate to the input elevation
WITH floor AS
(
    SELECT ST_Translate( ST_Force3DZ(footprint), 0, 0, elevation ) AS geom
),
-- Now make isometric (and begin Seth Code)
iso AS
(
    SELECT 
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )

    AS geom
    FROM floor
)
-- We'll force it back to 2D so QGIS is happy
SELECT ST_Force2D(geom) FROM iso
;
$BODY$
  LANGUAGE sql VOLATILE
  COST 100;

And voila!

DROP TABLE IF EXISTS class_c_isoc;

CREATE TABLE class_c_isoc AS
	SELECT gid, airspace, name, lowalt, highalt, threed_iso(geom, lowalt::numeric * 5) AS geom
	FROM class_c_subset;

Let’s take a look at Washington, DC and surrounds, another nice complicated example:

3D Figure of DC controlled airspace


And again with map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL:

3D Figure of DC controlled airspace


 


by smathermather at October 25, 2014 05:17 PM

Stephen Mather

Airspace is complicated — and so I abuse PostGIS once again

Let’s ignore for a moment the drone hobbyist / enthusiast. What is the shape of airspace for airplanes and for commercial and government unmanned aircraft flying under Certificates of Authorization, and how can we visualize it?

Thanks to Anita in the last post, we have the Class B, C, D, E Airspace Shape Files, which help us define the overall shape of controlled airspace.

Map of Detroit, Cleveland, and Pittsburgh Class B Airspace


But, these are 3D things. I want to visualize them thus. Let us put some constraints on the problem. Let’s do it all in PostGIS, that way we can see it in QGIS or on the web. Let’s not use full PostGIS 3D (i.e. the SFCGAL library), not because it isn’t awesome (it truly is) but because it can be hard to install at the moment (although see https://github.com/vpicavet/docker-pggis for an easy way with docker). Finally, true 3D with unconstrained viewing angles and 100% flexibility is… is… well it usually sucks. So, we’ll stick to isometric viewing (thanks to Seth Fitzsimmons of Stamen http://stamen.com/ for his PostGIS isometric code which will be released upon his word). (Update — all the code is there…):

-- Inputs are a geometry and an elevation to move the geometry to
CREATE OR REPLACE FUNCTION threed_iso(footprint geometry, elevation numeric)
  RETURNS geometry AS
$BODY$

-- Force 3D, then translate to the input elevation
WITH floor AS
(
    SELECT ST_Translate( ST_Force3DZ(footprint), 0, 0, elevation ) AS geom
),
-- Now make isometric (and begin Seth Code)
iso AS
(
    SELECT
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )

    AS geom
    FROM floor
)
-- We'll force it back to 2D so QGIS is happy
SELECT ST_Force2D(geom) FROM iso
;
$BODY$
  LANGUAGE sql VOLATILE
  COST 100;

Ok, now let’s rock some geometries with this bad function:

DROP TABLE IF EXISTS class_c_isoc;

CREATE TABLE class_c_isoc AS
	SELECT gid, airspace, name, lowalt, highalt, threed_iso(geom, lowalt::numeric * 5) AS geom
	FROM class_c_subset;

And what do our controlled airspaces look like?

Isometric view of Cleveland controlled airspace


Kind of conical, in this case with some “wings” that serve as approaches. It makes sense, I guess. At the center, where the airport is, controlled airspace goes from the ground up. As we get further from the airport, a set of concentric rings starting at higher elevations provides a controlled space that allows for landing and taking off.

Image of Cleveland airspace with Stamen Toner map as backdrop

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.

There are more complicated ones, naturally. We need look no further than Detroit for additional complexity:

Visualization of Detroit controlled airspace


Detroit controlled airspace with the Stamen Toner map as backdrop

No wonder every time I fly through Detroit, no matter where I’m going, I end up flying over Lake Erie.

If we want really complicated, we need look no further than Cincinnati:
Visualization of Cincinnati controlled airspace

What is going on there? Is that due to the shape of the hills, or city boundaries?

Finally, what does airspace look like over Ohio, West Virginia, and Pennsylvania (etc.), overall?
Regional view of controlled airspace over Ohio, West Virginia, and Pennsylvania

And while the following map isn’t quite right, here is a figure including many of the small airports sans controlled airspace:

View of all controlled and uncontrolled airspace across Ohio and neighbors.


May an aeronautical pox be upon you!
The above line was not intended in bad taste, but just an homage to the “red dot fever” of early neo-geography (neo-geography which I’m informed definitionally doesn’t exist). Only a few minutes ago, it dawned on me that the deleted phrase could be misinterpreted these days… .

On a related note, if you want an interesting analysis of ebola, read Zeynep Tufekci’s analysis.

(For the record, all heights are exaggerated by 5x, for clarity of reading.)

Also, in case it wasn’t obvious: Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.


by smathermather at October 25, 2014 06:13 AM

October 20, 2014

Oslandia

Oslandia releases "Cuardo", 3D GIS viewer


Oslandia announces today the preliminary result of one of its research and development projects: Cuardo.



Cuardo is an open source WebGL 3D data viewer focused on urban data analysis and visualization.



Cuardo features:

  • 3D terrain data model
  • Multi-textured terrain (satellite, thematic)
  • Draping of 2D vector data
  • Textured 3D meshes
  • Custom styling, with 3D effects (extrusion…)
  • Level of detail management
  • Classic 3D user navigation
  • WebServices integration ( WMS, WFS )
  • Identify tool with custom actions (popup, links…)
  • Focus on performance
  • High customization possibilities

What sets this tool apart is that it deals really well with 3D features as objects, including LOD management, whereas most 3D viewers cannot distinguish individual objects, which makes object picking hard. Being able to visualize a full city in your browser without installing any plugin, and to interact with other client applications, makes Cuardo a good option for building applications that require 3D visualization, and you can mix it with other great web libraries.


Cuardo is on GitHub, get it, fork it, PR it: https://github.com/Oslandia/cuardo


This software component is part of our 3D software stack. This is Cuardo's first public release; it is still under heavy development, but it is already usable for interesting use cases.


We are actively working on a full-featured 3D software solution focused on 3D geographical data management. Our aim is to provide a full set of components allowing you to:

  • Import 3D data
  • transform 2D data into 3D data
  • Store high volumes of data
  • Perform 3D analysis
  • Combine 3D data with elevation models, point clouds, raster data, 2D vector data and alphanumeric data
  • Set up web services for data dissemination, including 3D data
  • Visualize the data in a browser

The release of the Cuardo web client is the missing piece for a vertical 3D GIS data management stack.


Oslandia has strong expertise in 3D GIS, be it on the server side or, as this announcement shows, on the client side.


Oslandia is committed to improving this software solution, adapting it and making it fit your specific use cases. Do not hesitate to get in touch if you want more information on how to take advantage of your 3D spatial data in the best way. We will be delighted to hear about your use cases: infos+3d@oslandia.com.


We would like to thank the Grand Lyon metropolitan area for distributing a lot of GIS data as open data, including textured 3D models: http://www.smartdata.grandlyon.com

by Vincent Picavet at October 20, 2014 08:00 AM

October 09, 2014

Boston GIS (Regina Obe, Leo Hsu)

FOSS4G NA 2015 Call for Speakers now Open

FOSS4G NA 2015 (the Free and Open Source Geospatial North America conference) will run March 9th-12th, 2015 in Burlingame, California (Hyatt San Francisco Airport). The call for talks and workshops is now open, and we've already got some submissions to review. The deadline for submission is November 17th, 2014. Presenters (both workshop presenters and those giving a 35-minute standard talk) get to attend the 3 days of general sessions for free. The conference consists of one day of workshops and three days of general sessions, with lightning talks on some or all of the general session days.

Details about presentations are here: CFP details. FOSS4G NA 2015 is a collaborative event co-sponsored by OSGeo and LocationTech.

FOSS4G NA 2015 will be co-hosted with EclipseCon North America 2015. Admission to FOSS4G NA 2015 gives you admission to EclipseCon NA 2015. So get 2 conferences for the price of one and go conference hopping.

For PostgreSQL and PostGIS folks there is a likely chance that there will be a PostgreSQL day at the same venue during the same period. So we might have a very special PostGIS / PostgreSQL day. I'll update as more details of that surface.

by Regina Obe (nospam@example.com) at October 09, 2014 03:02 AM

October 01, 2014

Boundless Geo

PostGIS Training: Creating Overlays

At Boundless, we recognize that software is only as good as the problems it can be used to solve. That's why we don't just develop the best open source spatial software, we also teach you how to use it, with support, training, and workshops directly from the experts.

One question that comes up often during our PostGIS training is "how do I do an overlay?" The terminology can vary: sometimes they call the operation a "union" sometimes an "intersect". What they mean is, "can you turn a collection of overlapping polygons into a collection of non-overlapping polygons that retain information about the overlapping polygons that formed them?"


So an overlapping set of three circles becomes a non-overlapping set of 7 polygons.

Calculating the overlapping parts of a pair of shapes is easy, using the ST_Intersection() function in PostGIS, but that only works for pairs, and doesn't capture the areas that have no overlaps at all. How can we handle multiple overlaps and get out a polygon set that covers 100% of the area of the input sets? By taking the polygon geometry apart into lines, and then building new polygons back up. Let's construct a synthetic example: first, generate a collection of random points, using a Gaussian distribution, so there's more overlap in the middle. The crazy math in the SQL below (the Box-Muller transform) just converts the uniform random numbers from the random() function into normally distributed numbers.

CREATE TABLE pts AS
WITH rands AS (
  SELECT generate_series as id, random() AS u1, random() AS u2 FROM generate_series(1,100)
)
SELECT
  id,
  ST_SetSRID(ST_MakePoint(
    50 * sqrt(-2 * ln(u1)) * cos(2*pi()*u2),
    50 * sqrt(-2 * ln(u1)) * sin(2*pi()*u2)),4326) AS geom
FROM rands;

The result looks like this:

Now, we turn the points into circles, big enough to have overlaps.

CREATE TABLE circles AS
SELECT id, ST_Buffer(geom, 10) AS geom FROM pts;

Which looks like this: screenshot_71 Now it's time to take the polygons apart. In this case we'll take the exterior ring of the circles, using ST_ExteriorRing(). If we were dealing with complex polygons with holes, we'd have to use ST_DumpRings(). Once we have the rings, we want to make sure that everywhere rings cross the lines are broken, so that no lines cross, they only touch at their end points. We do that with the ST_Union() function.

CREATE TABLE boundaries AS
SELECT ST_Union(ST_ExteriorRing(geom)) AS geom
FROM circles;

What comes out is just lines, but with end points at every crossing.

Now that we have noded lines, we can collect them into a multi-linestring and feed them to ST_Polygonize() to generate polygons. The polygons come out as one big multi-polygon, so we'll use ST_Dump() to convert it into a table with one row per polygon.

CREATE SEQUENCE polyseq;
CREATE TABLE polys AS
SELECT nextval('polyseq') AS id, (ST_Dump(ST_Polygonize(geom))).geom AS geom
FROM boundaries;

Now we have a set of polygons with no overlaps, only one polygon per area. So, how do we figure out how many overlaps contributed to each incoming polygon? We can join the centroids of the new small polygons with the set of original circles, and calculate how many circles contain each centroid point. A spatial join will allow us to calculate the number of overlaps.

ALTER TABLE polys ADD COLUMN count INTEGER DEFAULT 0;
UPDATE polys SET count = p.count
FROM (
  SELECT count(*) AS count, p.id AS id  
  FROM polys p 
  JOIN circles c 
  ON ST_Contains(c.geom, ST_PointOnSurface(p.geom)) 
  GROUP BY p.id
) AS p
WHERE p.id = polys.id;

That's it! Now we have a single coverage of the area, where each polygon knows how much overlap contributed to it. Ironically, when visualized using the coverage count as a variable in the color ramp, it looks a lot like the original image, which was created with a simple transparency effect. However, the point here is that we've created new data, in the count attribute of the new polygon layer. The same decompose-and-rebuild-and-join-centroids trick can be used to overlay all kinds of features, and to carry over attributes from the original input data, achieving the classic "GIS overlay" workflow. Happy geometry mashing!

Want to learn more? Try our Introduction to PostGIS online training course!

The post PostGIS Training: Creating Overlays appeared first on Boundless.

by Paul Ramsey at October 01, 2014 03:33 PM

September 30, 2014

Boston GIS (Regina Obe, Leo Hsu)

Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile

One of the new features coming in PostGIS 2.2 is ST_ClipByBox2D (thanks to Sandro Santilli's recent commits funded by CartoDB). However, to take advantage of it, you are going to need your PostGIS compiled with GEOS 3.5+ (a very recent build), which has not been released yet. Windows folks, the PostGIS 2.2 9.3 and 9.4 experimental binaries are built with the latest GEOS 3.5 development branch, so you should be able to test this out with Winnie's experimental builds.

Since the dawn of PostGIS, PostGIS users have needed to mutilate their geometries in often painful and horrific ways. Why is the ST_ClipByBox2D function useful? Because it's a much faster way of mutilating your geometries with a rectangular mold than using ST_Intersection. Why would you want to mutilate your geometries? There are many reasons, but I'll give you one: as your geometry approaches the area of your bounding box, and as your bounding box size decreases, the more efficient your spatial index becomes. You can consider this article a Map Dicing redux of the article I wrote (eons ago), Map Dicing and other stuff, which describes the same approach with much older technology. Though I will be using more or less the same dataset, Massachusetts TOWNSSURVEY_POLYM (it's newer, so you can't really compare), and I tried to simplify my exercise a bit (not resorting to temp tables and such), my focus in this article will be to compare the speed between the new ST_ClipByBox2D approach and the old ST_Intersection approach. The spoiler, for those who don't have the patience for this exercise, is that using ST_ClipByBox2D, at least on my sample data set on my puny Windows 7 64-bit desktop using PostgreSQL 9.4beta2 64-bit, was about 4-5 times faster than using ST_Intersection. There was a downside to this speedup: with the ST_Intersection approach, I had no invalid polygons, while with ST_ClipByBox2D, I had one invalid polygon. So, as noted in the docs, use with caution. We'd be very interested in hearing other people's experiences with it.

One other benefit that ST_ClipByBox2D has over ST_Intersection, which I didn't test out, is that while ST_Intersection doesn't work with invalid geometries, ST_ClipByBox2D does.
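
To make the comparison concrete, here is a minimal sketch of the two approaches side by side. The towns table name, the box coordinates and the SRID 26986 are illustrative assumptions, not the actual benchmark queries from the exercise:

-- dice polygons with the new function (requires PostGIS 2.2 + GEOS 3.5)
SELECT gid,
       ST_ClipByBox2D(geom,
                      ST_MakeBox2D(ST_Point(200000, 890000),
                                   ST_Point(210000, 900000))) AS geom
FROM towns  -- hypothetical table of town polygons
WHERE geom && ST_MakeEnvelope(200000, 890000, 210000, 900000, 26986);

-- the older, slower equivalent
SELECT gid,
       ST_Intersection(geom,
                       ST_MakeEnvelope(200000, 890000, 210000, 900000, 26986)) AS geom
FROM towns
WHERE geom && ST_MakeEnvelope(200000, 890000, 210000, 900000, 26986);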

For these exercises, I'm also going to abuse the PostGIS raster function ST_Tile for geometry use, because for some strange reason we have no ST_Tile function for PostGIS geometry. For those who missed the improper use of ST_Tile for raster, refer to Waiting for PostGIS 2.1 - ST_Tile.


Continue reading "Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile"

by Regina Obe (nospam@example.com) at September 30, 2014 02:55 PM

September 26, 2014

PostGIS Development

Getting distinct pixel values and pixel value counts of a raster

PostGIS raster has so many functions, and probably at least 10 ways of doing any given thing, some much slower than others. Suppose you have a raster, or a raster area of interest (say an elevation raster, for example), and you want to know the distinct pixel values in the area. The temptation is to reach for the ST_Value raster function, but there is a much more efficient function to use, and that is ST_ValueCount.

The ST_ValueCount function is one of many statistical raster functions available in PostGIS 2.0+. It is a set-returning function that returns two values for each row: a pixel value (value), and a count of pixels (count) in the raster that have that value. It also has variants that allow you to filter for certain pixel values.

This tip was prompted by the question on stackexchange How can I extract all distinct values from a PostGIS Raster?
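
As a small taste of the pattern before the full tip, here is a minimal sketch; the elevation table name is a hypothetical stand-in for your own raster table:

-- distinct pixel values of band 1 and how many pixels carry each value,
-- summed across all raster tiles in the table
SELECT (pvc).value, SUM((pvc).count) AS total
FROM (SELECT ST_ValueCount(rast, 1) AS pvc FROM elevation) AS t
GROUP BY (pvc).value
ORDER BY (pvc).value;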

Continue Reading by clicking title hyperlink ..

by Regina Obe at September 26, 2014 12:00 AM

September 25, 2014

Archaeogeek (Jo Cook)

New Portable GIS Releases

I’m pleased to announce not one, but two new releases of Portable GIS! The first, version 4.2, contains QGIS 2.4 and PostGIS 1.5 and will be the last release to include that version of PostGIS.

The second, version 5, contains QGIS 2.4 and PostGIS 2.1 and all future releases will be based on this. Get them here:

There are two important things about these two releases. The first is that the download size is almost twice what it was previously; this is out of my control and is due to the increased size of the QGIS installation. The second is that version 5.0 should be considered a beta release as it's very much untested. If there are any problems, let me know via the Portable GIS google group.

You might also be interested to see a short presentation I did about Portable GIS at GeoMob London in September: http://archaeogeek.github.io/geomob-portablegis/#/

September 25, 2014 08:49 AM

September 24, 2014

Oslandia

Back from FOSS4G 2014 - part 2



Apart from Oslandia's participation at FOSS4G 2014 in Portland ( See part 1), a lot of topics have been discussed during this week of conference.


Presentation slides and videos are being put online, and you can replay the whole conference if you could not attend, but here is a really short summary of a few things we noticed during the five days.

  • Fiona and Rasterio are now mature modern GIS Python libraries on top of GDAL/OGR, and are continuously improved.
  • WebGL is all over the place. With new support from the various browser vendors, this is definitely the backend of choice for rendering large datasets on online maps. This is still the exploration phase though, and there is currently no definitive solution for WebGL mapping, as things are evolving really fast and new use cases keep appearing.
  • Use cases for open source GIS software continue to grow larger. It even goes to space with the PlanetLabs satellite constellation!
  • OpenLayers 3 and Leaflet are the two client libraries of choice, with distinct use cases, but excellent quality in both
  • Design is here, for the good of our eyes and the user experience. This is the first time that a specific track has been fully dedicated to map design.
  • Vectors strike back! With the evolution of browser technologies, vector data is gaining momentum. Vector tiles are a natural evolution of this, and formats and standards are starting to emerge. More performance, less space, and a door open to new end-user features
  • Following the vector movement, symbology is also evolving to become dynamic. What if those colors blended nicely from one to another, and the thickness of these lines were controlled by a function of the zoom level? Better UX!
  • GIS involves large datasets, and open source software is now ready to crunch these terabytes of data. Be it mosaicking rasters for the whole world, machine learning to predict climate change, or distributed data analysis, data size limits continue to be pushed further.
  • Well-known open source components continuously gain features: MapServer, QGIS, GDAL/OGR and PostGIS improve from version to version
  • CartoDB raised 8 million USD and continues to grow
  • A large part of the Mapbox team was present and showed various technologies which are at the bleeding edge of online mapping
  • A lot of large organizations were present, among them TriMet, US DoD, Amazon (sponsor), GitHub, ESRI, RedHat… Big players are in the place too

FOSS4G is technical and at the same time user-oriented. It gathers the open source community, industry and academics alike. This is the best way to foster intense brainstorming and create new ideas, visions and technologies.


Do not hesitate to check out the videos and give us your favorites, tweeting @oslandia or mailing us about any of these subjects at infos@oslandia.com!

by Vincent Picavet at September 24, 2014 05:00 PM

September 23, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4G and PGOpen 2014 presentations

At FOSS4G we gave two presentations. The videos from the other presentations are in the FOSS4G 2014 album. I have to commend the organizers for gathering such a rich collection of videos. Most of the videos turned out very well and are also downloadable as MP4 in various resolutions. It really was a treat being able to watch all I missed. I think there are still some videos that will be uploaded soon. As mentioned, lots of PostGIS/PostgreSQL talks (or at least people using PostGIS/PostgreSQL in GIS).


Continue reading "FOSS4G and PGOpen 2014 presentations"

by Leo Hsu and Regina Obe (nospam@example.com) at September 23, 2014 06:55 PM

Oslandia

Back from FOSS4G 2014 - part 1



FOSS4G, the main international OSGeo event, takes place once a year and gathers all folks involved in open source geospatial technologies at large.

This year, the conference took place in Portland, Oregon, and gathered around 1000 people, on topics ranging from deeply technical MapServer pro tips to burnout management, open data policies and map design.

This year's edition was really good, as always for this conference. A lot of social events allowed direct talks to passionate people from all over the world, and the presentations were diverse and of high quality.

Oslandia at FOSS4G

Oslandia is a full player in the open source GIS field, and we participated this year with several contributions.


First of all, we gave a workshop on our 3D stack, with new features and what is now a full vertical stack of software, from storage to visualization.

This workshop features the use of this stack for managing 3D GIS data. Given an open data dataset from the city of Lyon, France, we integrate the data into PostGIS and use PostGIS 3D features to analyze and enrich this data. Then Horao ( http://oslandia.github.io/horao/ ), a 3D desktop viewer coupled with QGIS, is used to do some 3D visualization.


In a second part of the workshop, we use the very recently released Cuardo web client ( http://www.cuardo.org ) to display the data in a standard browser, using WebGL.



The workshop is self-contained and you can follow it completely to discover these exciting new features in open source 3D GIS. Find it online: https://github.com/Oslandia/workshop-3d

Gimme some YeSQL !

Vincent Picavet also gave a presentation, "Gimme some YeSQL - and a GIS -", which focused on the latest and upcoming PostgreSQL features. From updatable views to the new JSONB data type, this is a practical and technical presentation of the big improvements coming in PostgreSQL 9.4. The talk illustrates the use of these features in a GIS context.

Slides are online, as well as the video (the title is wrong).



OpenSource software for surveying

Another presentation by Vincent gave insight into a project we delivered for GIS Apavil in Romania: a full GIS for water distribution and wastewater networks. The presentation especially shows how we set up some specific open source surveying tools, and the way we deal with offline use, versioning and synchronization of data.

Slides are on GitHub and the video should be up soon.

3D GIS

Olivier Courtin showed a summary of the current state of the 3D stack we develop and its foreseen future. Slides and video are online, and there should be more information coming soon.


Slides : https://github.com/Oslandia/presentations/raw/master/foss4g_2014/gisgoes3d_olivier_courtin.pdf



Get in touch !

Oslandia will continue to make research and development on all of these topics. If you are interested in any of these subjects, do not hesitate to get in touch at infos@oslandia.com


A post to follow will give you some background information on what we heard at FOSS4G and where to hear more!

by Vincent Picavet at September 23, 2014 05:00 PM

September 22, 2014

Paul Ramsey

PostGIS Feature Frenzy

A specially extended feature frenzy for FOSS4G 2014 in Portland. Usually I only frenzy for 25 minutes at a time, but they gave me an hour long session!

PostGIS Feature Frenzy — Paul Ramsey from FOSS4G on Vimeo.

Thanks to the organizers for giving me the big room and big slot!

by Paul Ramsey (noreply@blogger.com) at September 22, 2014 11:30 PM

September 16, 2014

Paul Ramsey

PostGIS for Managers

At FOSS4G this year, I wanted to take a run at the decision process around open source with particular reference to the decision to adopt PostGIS: what do managers need to know before they can get comfortable with the idea of making the move.

The Manager's Guide to PostGIS — Paul Ramsey from FOSS4G on Vimeo.

by Paul Ramsey (noreply@blogger.com) at September 16, 2014 07:48 PM

September 11, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4G 2014 televised live

If you weren't able to make it to FOSS4G 2014 this year, you can still experience the event live. All the tracks are being televised live and the reception is pretty good. https://2014.foss4g.org/live/. Lots of GIS users using PostGIS and PostgreSQL. People seem to love Node.JS too.

After hearing enough about Node.JS from all these people, and this guy (Bill Dollins), I decided to try this out for myself.

I created a node.js web application - which you can download from here: https://github.com/robe2/node_postgis_express . It's really a spin-off from my other viewers, but more raw. I borrowed the same ideas as Bill, but instead of using a native node Postgres driver, I went for the pure JavaScript one so it's easier to install on all platforms. I also experimented with using base-64 encoding to embed raster output directly into the browser so I don't have to have that silly img src path reference thing to contend with.

by Leo Hsu and Regina Obe (nospam@example.com) at September 11, 2014 08:37 PM

September 10, 2014

A GeoSpatial World

Converting 3D formats (OBJ, PLY, STL, VTP) to PostGIS WKT POLYHEDRALSURFACE


3D (OBJ, PLY, STL, VTP) to WKT POLYHEDRALSURFACE

Introduction

The idea for the tool I am about to present came from reading a message on the PostGIS-User list, polyhedral surface import from common 3D formats. The author of that message was looking for a tool to translate a few 3D formats (OBJ, PLY, STL ..) to WKB/WKT for a POLYHEDRALSURFACE geometry, so I thought: why not develop such a tool, since the 3D viewer I had already built would let me validate the conversion.
I developed this tool in C# 2013 with:
  • ActiViz .Net, a wrapper for VTK
  ActiViz .Net
  VTK
  • ScintillaNET, a powerful text editing control for Windows and a wrapper for the Scintilla component.
  ScintillaNET
  Scintilla
I then searched the internet for sample files for each 3D data format, which I included in the Data directory of the tool's installation. Here are a few links where you will find files to download:
  • OBJ
  OBJ Files
  • PLY
  PLY Files
  • STL
  STL Files Sharing

Installation

Click on the pg3DImport_setup.exe link to download the installer, then run it.
Click the Next button at each step, then the Finish button.

Usage

Double-click the icon created by the installer to launch the application.

Opening and converting a 3D file

Click the open icon to choose a 3D file to convert. Go to the Pg3DImport installation directory and into the DATA subdirectory, select the type of 3D file you want to convert (OBJ Files, for example), then select the cesna.obj file and click the Open button.

You can then view the file in the viewer pane.

Click the convert icon to produce the POLYHEDRALSURFACE WKT.

You can then copy the result from the editor pane to insert it into a PostGIS table (version 2.0 or higher), or view the result in the Pg3DViewer application, which you have most likely already installed.
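
For illustration, here is a minimal sketch of what inserting such WKT into PostGIS could look like. The solids table is hypothetical and the POLYHEDRALSURFACE below is a hand-written unit cube rather than the tool's actual output:

-- hypothetical target table
CREATE TABLE solids (id serial PRIMARY KEY, geom geometry);

-- insert a simple closed unit cube as a POLYHEDRALSURFACE
INSERT INTO solids (geom) VALUES (ST_GeomFromText(
  'POLYHEDRALSURFACE Z (
     ((0 0 0, 0 1 0, 1 1 0, 1 0 0, 0 0 0)),
     ((0 0 0, 0 0 1, 0 1 1, 0 1 0, 0 0 0)),
     ((0 0 0, 1 0 0, 1 0 1, 0 0 1, 0 0 0)),
     ((1 1 1, 1 0 1, 1 0 0, 1 1 0, 1 1 1)),
     ((1 1 1, 1 1 0, 0 1 0, 0 1 1, 1 1 1)),
     ((1 1 1, 0 1 1, 0 0 1, 1 0 1, 1 1 1)))'));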

Viewing in Pg3DViewer

Another example

Open an STL file from the installation's DATA directory and display it in Pg3DViewer.

You can open and convert the various files included in the installation's DATA directory, but if you have 3D files of your own, use those instead.

Conclusion

We have reached the end of this post, one more step into the new 3D world of PostGIS. Do not hesitate to share your ideas for improvement with me.

by Jérôme ROLLAND (noreply@blogger.com) at September 10, 2014 09:38 AM

PostGIS Development

PostGIS 2.1.4 Released

The 2.1.4 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1, the 2.1.4 release. As befits a patch release, the focus is on bugs, breakages, and performance issues.

http://download.osgeo.org/postgis/source/postgis-2.1.4.tar.gz

Continue Reading by clicking title hyperlink ..

by Regina Obe at September 10, 2014 12:00 AM

September 09, 2014

Safe Software

Free and Open Source: The Inside Scoop on FME and FOSS

“Open source tools play well with others.” Paul Ramsey, PostGIS founder, talks Open Source at the FME International User Conference 2014.

As a software developer, coding something that’s already been designed, implemented, and optimized is a pretty gigantic waste of time.

Open source libraries help developers avoid re-inventing wheels. At this year’s FME UC, PostGIS founder Paul Ramsey took the stage to talk about the awesomeness that is Open Source. He explained how it’s important that developers use available knowledge—and FME devs understand this, adding value by building useful functionality on top of open source libraries.

Safe Software proudly supports the Free and Open Source Software (FOSS) community and regularly sponsors the North American FOSS4G conferences. FME uses many FOSS libraries, and when the devs here at Safe aren’t playing soccer or LEGO or a game of oversized Jenga, they’re working hard to make these libraries more accessible.

Our use of these libraries is a friendly two-way street. A two-way Yellow Brick Road, if you will. FOSS adds value to FME’s advanced data transformation and automation capabilities, while our contributions to these technologies have included funding, coding, testing, and bug fixes. Here are ten such libraries.

10 FOSS Libraries that FME Uses and Contributes to

This Mario-like map is made possible by the integration of Mapnik and FME.

1. Mapnik

Mapnik is a toolkit that renders raster and vector data into beautiful cartographic maps. FME’s Mapnik transformer lets users leverage Mapnik’s capabilities with any of the 325+ data formats FME supports. With Mapnik and FME, you can also create Web Map Tiling Services, make a map out of your face, map yer way to buried treasure arr matey!, and other such highly critical tasks. FME’s intuitive graphical interface makes it easy for any user to benefit from this complex library.

2. SpatiaLite

SpatiaLite extends SQLite into a simple Spatial DBMS. FME’s SpatiaLite Reader / Writer allows users to connect this format with hundreds of other formats, and leverage all of FME’s data transformation and automation capabilities. For example, you can instantly import SpatiaLite data into Google Maps Engine using our free Data Loader for Google Maps Engine.

FME reads and writes a variety of open source databases.

3. PostGIS

PostGIS is a spatial database that adds support for geographic objects and location queries to PostgreSQL. FME’s PostGIS Reader / Writer lets users integrate hundreds of data sources with this format. Watch Paul Ramsey’s FME UC talk to see the evolution of PostGIS, from the ideas that inspired it to its widespread use today.

4. JTS and GEOS

JTS and GEOS (Geometry Engine – Open Source) provide spatial predicates and functions for processing 2D geometry. JTS is the Java library and GEOS is the C++ library. Some of FME’s powerful geometry transformations are based on GEOS/JTS algorithms.

5. OGR and GDAL (Geospatial Data Abstraction Library)

These OSGeo libraries support reading/writing over 50 raster formats (GDAL) and 20 vector formats (OGR). FME uses these libraries to add several readers and writers to its list of supported formats, which means these formats gain a wide range of enterprise transformation and automation capabilities.

Bonus: if you use a GDAL-based format in FME, you can feel comfortable knowing it was probably implemented by our in-house world Tetris champion.

6. FDO

FDO works with a variety of geospatial data. This OSGeo API brings additional formats to FME, for example SQLite. In addition, Safe offers the free FME FDO Provider to allow AutoCAD Map 3D users to read 9 additional data formats. Safe was one of the earliest adopters of this API (version 3.0) and is acknowledged as an official Product Using FDO.

7. Ingres

Ingres is an RDBMS that supports large commercial and government applications. Safe’s partnership with Actian resulted in an open-source FME reader/writer for Ingres tables. This means increased data accessibility for Ingres users, making it easy to integrate spatial and enterprise data.

FME Workbench and the FME Data Inspector are available on Linux.

8. Qt

Qt is an application framework that we’ve used to help make FME fully cross-platform. Since FME 2013, FME has been available in beta on Mac and Linux platforms.

9. Interlis

Interlis is a geodata modeling and integration language adopted by the Swiss government as their national standard. Eisenhut Informatik is one of Safe’s longest standing open source partners, and their ili2fme plug-in allows Interlis data to be translated to/from any of the formats supported by FME.

10. LASzip

LASzip is a specialized library by Martin Isenburg that compresses LiDAR data without information loss—turning chubby LAS point clouds into lean, manageable LAZ files. The incorporation of this technology into FME has resulted in a great improvement in the ability to archive and exchange point cloud data.

But Wait, There’s More

For the full list of libraries licensed under the LGPL, visit our FOSS Code Center. You can also view more information about FME’s open source libraries by opening FME Desktop and selecting About > Legal Notices.

FME for Free

The Data Loader for Google Maps Engine is one free tool we offer for converting data with FME technology.

FME is FOSS! Sort of. We do offer some free data translation tools.

In addition to the FME FDO Provider mentioned above, we also offer the Data Loader for Google Maps Engine, the CityGML Importer for InfraWorks, and the FME Revit Exporter Plug-in.

The Easy Translator is also available as a free web service, for immediately translating data into your required format and coordinate system.

Moving forward, we aim to make more free tools and web services like these.

How have you integrated your software or project with Free and Open Source Software? Are you attending FOSS4G this year, or have you in the past? Let us know in the comments!

The post Free and Open Source: The Inside Scoop on FME and FOSS appeared first on Safe Software Blog.

by Tiana Warner at September 09, 2014 06:19 PM

A GeoSpatial World

Pg3DViewer: new version

 

Pg3DViewer: new version

I have put together a new version of the Pg3DViewer tool, which integrates ScintillaNET, a powerful text editing control with syntax highlighting for Windows and a wrapper for the Scintilla component. I used this control to handle PostgreSQL's SQL syntax and the PostGIS functions. The pgsql.xml file, which you will find in the installation directory, contains everything needed to manage the keyword lists and the style applied to each list; I leave you the pleasure of exploring it and adapting it to your liking.

Download link: Pg3DViewer new version

by Jérôme ROLLAND (noreply@blogger.com) at September 09, 2014 04:00 PM

September 02, 2014

Boston GIS (Regina Obe, Leo Hsu)

PostGIS sessions at Postgres Open 2014 Chicago not too late

If you haven't signed up for our Postgres Open 2014 PostGIS tutorials, there are still a few tickets available, but time and space are running out.

For those of you who've already signed up for our tutorials in September 17th, 2014, keep an eye on this page: http://www.postgis.us/pgopen2014. Not much to see there yet, but we'll be posting content there before the tutorials as well as the examples so you can flaunt your mastery of cutting and pasting.

by Regina Obe (nospam@example.com) at September 02, 2014 03:20 AM

Postgres OnLine Journal (Leo Hsu, Regina Obe)

PostGIS presentations at Postgres Open 2014 in Chicago

For those of you who will be attending either of our PostGIS sessions in Postgres Open 2014 in Chicago September 17th, we've put up a page where we'll be posting links to the data we'll be using as well as the examples so you can follow along and copy and paste. Page is http://www.postgis.us/pgopen2014.

We ask that you bookmark the page before you come and try to install PostGIS 2.1 if you haven't already. Not much to see there yet. For those who haven't signed up, it's not too late. For those who will be unable to make it, we'll try to post all the content from the tutorials on the above page.

by Leo Hsu and Regina Obe (nospam@example.com) at September 02, 2014 03:05 AM

August 20, 2014

shisaa.jp

Postgis and PostgreSQL in Action - Timezones

Preface

Recently, I was lucky to be part of an awesome project called the Breaking Boundaries Tour.

This project is about two brothers, Omar and Greg Colin, who take their Stella scooters on a full round trip across the United States. And, while they are at it, try to raise funding for Surfer's Healing Folly Beach - an organization that does great work enhancing the lives of children with autism through surfing. To accommodate this trip, they wished to have a site where visitors could follow their trail live, as it happened. A marker would travel across the map, with them, 24/7.

Furthermore, they needed the ability to jump off their scooters, snap a few pictures, edit a video, write some side info and push it onto the net, for the whole world to see. Immediately after they made their post, it had to appear at the exact spot they were at when snapping their moments of beauty.

To aid in the live tracking of their global position, they acquired a dedicated GPS tracking device which sends a latitude/longitude coordinate via a mobile data network every 5 minutes.

Now, this (short) post is not about how I built the entire application, but rather about how I used PostGIS and PostgreSQL for a rather peculiar matter: deducing timezone information.

For those who are interested though: the site is entirely build in Python using the Flask "micro framework" and, of course, PostgreSQL as the database.

Timezone information?

Yes. Time, dates, timezones: hairy worms in hairy cans which many developers hate to open, but have to sooner or later.

In the case of Breaking Boundaries Tour, we had one major occasion where we needed the correct timezone information: where did the post happen?

Where did it happen?

A feature we wanted to implement was one to help visitors get a better view of when a certain post was written. Being able to see when a post was written in your local timezone is much more convenient than seeing the post time in some foreign zone.

We are lazy and do not wish to count backward or forward to figure out when a post popped up in our own frame of time.

The reasoning is simple: always calculate all the times involved back to plain UTC (GMT). Then figure out the client's timezone using JavaScript, apply the time difference, and done!

Simple eh?

Correct, except for one small detail in the feature request: in what zone was the post actually made?

Well...damn.

While your heart might be in the right place while thinking: "Simple, just look at the locale of the machine (laptop, mobile phone, ...) that was used to post!", this information is just too fragile. Remember, the brothers are crossing the USA, riding through at least three major timezones. You simply cannot expect all the devices involved when posting to always adjust their locale automatically depending on where they are.

We need a more robust solution. We need PostGIS.

But, how can a spatial database help us to figure out the timezone?

Well, thanks to the hard labor delivered to us by Eric Muller from efele.net, we have a complete and maintained shapefile of the entire world, containing polygons that represent the different timezones accompanied by the official timezone declarations.

This enables us to use the latitude and longitude information from the dedicated tracking device to pinpoint in which timezone they were while writing their post.

So let me take you on a short trip to show you how I used the above data in conjunction with PostGIS and PostgreSQL.

Getting the data

The first thing to do, obviously, is to download the shapefile data and load it in to our PostgreSQL database. Navigate to the Timezone World portion of the efele.net site and download the "tz_world" shapefile.

This will give you a zip which you can extract:

$ unzip tz_world.zip

Unzipping will create a directory called "world" in which you can find the needed shapefile package files.

Next you will need to make sure that your database is PostGIS ready. Connect to your desired database (let us call it bar) as a superuser:

$ psql -U postgres bar

And create the PostGIS extension:

CREATE EXTENSION postgis;

Now go back to your terminal and load the shapefile into your database using the original owner of the database (here called foo):

$ shp2pgsql -S -s 4326 -I tz_world | psql -U foo bar

As you might remember from the PostGIS series, this loads in the geometry from the shapefile using only simple geometry (not "MULTI..." types) with a SRID of 4326.

What have we got?

This will take a couple of seconds and will create one table and two indexes. If you describe your database (assuming you have not made any tables yourself):

public | geography_columns | view     | postgres
public | geometry_columns  | view     | postgres
public | raster_columns    | view     | postgres
public | raster_overviews  | view     | postgres
public | spatial_ref_sys   | table    | postgres
public | tz_world          | table    | foo
public | tz_world_gid_seq  | sequence | foo

You will see the standard PostGIS bookkeeping and you will find the tz_world table together with a gid sequence.

Let us describe the table:

\d tz_world

And get:

Column |          Type          |                       Modifiers                        
--------+------------------------+--------------------------------------------------------
gid    | integer                | not null default nextval('tz_world_gid_seq'::regclass)
tzid   | character varying(30)  | 
geom   | geometry(Polygon,4326) | 
Indexes:
    "tz_world_pkey" PRIMARY KEY, btree (gid)
    "tz_world_geom_gist" gist (geom)

So we have:

  • gid: an arbitrary id column
  • tzid: holding the standards compliant textual timezone identification
  • geom: holding polygons in SRID 4326.

Also notice we have two indexes made for us:

  • tz_world_pkey: a simple B-tree index on our gid
  • tz_world_geom_gist: a GiST index on our geometry

This is a rather nice set, would you not say?

Using the data

So how do we go about using this data?

As I have said above, we need to figure out in which polygon (timezone) a certain point resides.

Let us take an arbitrary point on the earth:

  • latitude: 35.362852
  • longitude: 140.196131

This is a spot in the Chiba prefecture, central Japan.

Using the Simple Features functions we have available in PostGIS, it is trivial to find out in which polygon a certain point resides:

SELECT tzid 
    FROM tz_world
    WHERE ST_Intersects(ST_GeomFromText('POINT(140.196131 35.362852)', 4326), geom);

And we get back:

    tzid    
------------
 Asia/Tokyo

Awesome!

In the above query I used the function ST_Intersects, which checks if a given piece of geometry (our point) shares any space with another piece. If we check the execution plan of this query:

EXPLAIN ANALYZE SELECT tzid 
    FROM tz_world
    WHERE ST_Intersects(ST_GeomFromText('POINT(140.196131 35.362852)', 4326), geom);

We get back:

                                                      QUERY PLAN                                                          
------------------------------------------------------------------------------------------------------------------------------
Index Scan using tz_world_geom_gist on tz_world  (cost=0.28..8.54 rows=1 width=15) (actual time=0.591..0.592 rows=1 loops=1)
    Index Cond: ('0101000020E61000006BD784B446866140E3A430EF71AE4140'::geometry && geom)
    Filter: _st_intersects('0101000020E61000006BD784B446866140E3A430EF71AE4140'::geometry, geom)
Total runtime: 0.617 ms

That is not bad at all: a runtime of a little over 0.6 milliseconds, and it is using our GiST index.

But, if a lookup is using our GiST index, a small alarm bell should go off inside your head. Remember my last chapter on the PostGIS series? I kept on babbling about index usage and how geometry functions or operators can only use GiST indexes when they perform bounding box calculations.

The latter might pose a problem in our case, for bounding boxes are very rough approximations of the actual geometry. This means that when we arrive near timezone borders, our calculations might just give us the wrong timezone.

So how can we fix this?

This time, we do not need to.

This is one of the few blessed functions that both makes use of an index and is very accurate.

The ST_Intersects first uses the index to perform bounding box calculations. This filters out the majority of available geometry. Then it performs a more expensive, but more accurate calculation (on a small subset) to check if the given point is really inside the returned matches.

We can thus simply use this function without any more magic...life is simple!

Implementation

Now it is fair to say that we do not wish to perform this calculation every time a user views a post; that would be neither efficient nor smart.

Rather, it is a good idea to generate this information at post time, and save it for later use.

The way I have set up the saving of this information is twofold:

  • I only save a UTC (GMT) generalized timestamp of when the post was made.
  • I made an extra column in my so-called "posts" table where I only save the string that represents the timezone (Asia/Tokyo in the above case).

This keeps the date/time information in the database naive of any timezone and makes for easier calculations to give the time in either the clients timezone or in the timezone the post was originally written. You simply have one "root" time which you can move around timezones.
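
As a small sketch of that last point (the column names here are assumptions, since the article does not spell them out), shifting the stored naive UTC timestamp into the post's own timezone is a one-liner:

-- post_time is a timestamp without time zone holding UTC,
-- tzid holds strings like 'Asia/Tokyo'
SELECT pid,
       post_time AT TIME ZONE 'UTC' AT TIME ZONE tzid AS local_post_time
FROM posts;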

On every insert of a new post I have created a trigger that fetches the timezone and inserts it into the designated column. You could also fetch the timezone and update the post record using Python, but opting for an in-database solution saves you a few extra, unneeded round trips and is most likely a lot faster.

Let us see how we could create such a trigger.

A trigger in PostgreSQL is an event you can set to fire when certain conditions are met. The event(s) that fire have to be encapsulated inside a PostgreSQL function. Let us thus first start by creating the function that will insert our timezone string.

Creating functions

In PostgreSQL you can write functions in C, in procedural languages (PL/pgSQL, Perl, Python) or in plain SQL.

Creating functions with plain SQL is the most straightforward and easy way. However, since we want to write a function that is to be used inside a trigger, we have an even better option. We can employ the power of the embedded PostgreSQL procedural language to easily access and manipulate our newly inserted data.

First, let us see which query we would use to fetch the timezone and update our post record:

UPDATE posts
  SET tzid = timezone.tzid
  FROM (SELECT tzid
          FROM tz_world
          WHERE ST_Intersects(
            ST_SetSRID(
              ST_MakePoint(140.196131, 35.362852),
            4326),
          geom)) AS timezone
  WHERE pid = 1;

This query will fetch the timezone string using a subquery and then update the correct record (a post with "pid" 1 in this example).

How do we pour this into a function?

CREATE OR REPLACE FUNCTION set_timezone() RETURNS TRIGGER AS $$
BEGIN
    UPDATE posts
    SET tzid = timezone.tzid
    FROM (SELECT tzid 
            FROM tz_world
              WHERE ST_Intersects(
                ST_SetSRID(
                  ST_MakePoint(NEW.longitude, NEW.latitude),
                4326),
              geom)) AS timezone
    WHERE pid = NEW.pid;
    RETURN NEW;
END $$
LANGUAGE PLPGSQL IMMUTABLE;

First we use the syntax CREATE OR REPLACE FUNCTION to indicate we want to create (or replace) a custom function. Then we tell PostgreSQL that this function will return type TRIGGER.

You might notice that we do not give this function any arguments. The reasoning here is that this function is "special". Functions which are used as triggers magically get information about the inserted data available.

Inside the function you can see we access our latitude and longitude prefixed with NEW. These keywords, NEW and OLD, refer to the record after and before the trigger(s) happened. In our case we could have used both, since we do not alter the latitude or longitude data, we simply fill a column that is NULL by default. There are more keywords available (TG_NAME, TG_RELID, TG_NARGS, ...) which refer to properties of the trigger itself, but that is beyond today's scope.

The actual SQL statement is wrapped between double dollar signs ($$). This is called dollar quoting and is the preferred way to quote your SQL string (as opposed to using single quotes). The body of the function, which in our case is mostly the SQL statement, is surrounded with a BEGIN and END keyword.

A trigger function always needs a RETURN statement that is used to provide the data for the updated record. This too has to reside in the body of the function.

Near the end of our function we need to declare in which language this function was written, in our case PLPGSQL.

Finally, the IMMUTABLE keyword tells PostgreSQL that this function is rather "functional", meaning: if the inputs are the same, the output will also, always be the same. Using this caching keyword gives our famous PostgreSQL planner the ability to make decisions based on this knowledge.

Creating triggers

Now that we have this functionality wrapped into a tiny PLPGSQL function, we can go ahead and create the trigger.

First you have the event on which a trigger can execute, these are:

  • INSERT
  • UPDATE
  • DELETE
  • TRUNCATE

Next, for each event you can specify at what timing your trigger has to fire:

  • BEFORE
  • AFTER
  • INSTEAD OF

The last one is a special timing by which you can replace the default behavior of the mentioned events.

For our use case, we are interested in executing our function AFTER INSERT.

CREATE TRIGGER set_timezone
    AFTER INSERT ON posts
    FOR EACH ROW
    EXECUTE PROCEDURE set_timezone();

This will setup the trigger that fires after the insert of a new record.
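
As a quick sanity check (a sketch which assumes the posts table exposes at least pid, latitude, longitude and tzid, with defaults for anything else), inserting the Chiba coordinates from earlier should leave 'Asia/Tokyo' behind in the tzid column:

INSERT INTO posts (latitude, longitude) VALUES (35.362852, 140.196131);

SELECT pid, tzid FROM posts ORDER BY pid DESC LIMIT 1;
-- tzid should now read 'Asia/Tokyo'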

Wrapping it up

Good, that's all there is to it.

We use a query, wrapped in a function, triggered by an insert event, to inject the official timezone string as deduced by PostGIS's spatial abilities.

Now you can use this information to get the exact timezone of where the post was made, and use it to present the visiting client both the time in the post's timezone and their own local time.

For the curious ones out there: I used the MomentJS library for the client side time parsing. This library offers a timezone extension which accepts these official timezone strings to calculate offsets. A lifesaver, so go check it out.

Also, be sure to follow the bros while they scooter across the States!

And as always...thanks for reading!

by Tim van der Linden at August 20, 2014 10:00 AM

August 08, 2014

Stephen Mather

KNN with FLANN and laspy, a starting place

FLANN is Fast Library for Approximate Nearest Neighbors, which is a purportedly wicked fast nearest neighbor library for comparing multi-dimensional points. I only say purportedly, as I haven’t verified, but I assume this to be quite true. I’d like to move some (all) of my KNN calculations outside the database.

I’d like to do the following with FLANN– take a LiDAR point cloud and change it into a LiDAR height-above-ground point cloud. What follows is my explorations so far.

In a previous series of posts, e.g. https://smathermather.wordpress.com/2014/07/14/lidar-and-pointcloud-extension-pt-6/

I have been using the point cloud extension in PostGIS. I like the 2-D chipping, but I think I should segregate my data into height classes before sending it into the database. In this way, I can query my data by height class and by location efficiently, taking full advantage of the efficiencies of storing all those little points in chips, while also being able to query the data in any of the dimensions I need to in the future. Enter FLANN.

I haven’t gotten far. To use FLANN with LiDAR through Python, I’m also using laspy.  There’s a great tutorial here: http://laspy.readthedocs.org/en/latest/tut_part_1.html

I make one change to the tutorial section using FLANN. The code as written is:

import laspy
import pyflann as pf
import numpy as np

# Open a file in read mode:
inFile = laspy.file.File("./laspytest/data/simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.

neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

To make this example work with the current version of pyflann, we need to make sure we import all of pyflann (or at least nn), and also set flann = FLANN() as follows:

import laspy
import numpy as np
from pyflann import *

# Open a file in read mode:
inFile = laspy.file.File("simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.
flann = FLANN()
neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

Finally, a small note on installation of pyflann on Ubuntu. What I'm about to document is undoubtedly not the recommended way to get pyflann working. But it worked…

Installation for FLANN on Ubuntu can be found here: http://www.pointclouds.org/downloads/linux.html

But this does not seem to install pyflann. That said, it installs all our dependencies + FLANN, so…

I cloned, compiled, and installed the FLANN repo: https://github.com/mariusmuja/flann

git clone git://github.com/mariusmuja/flann.git
cd flann
mkdir BUILD
cd BUILD
cmake ../.
make
sudo make install

This get’s pyflann where it needs to go, and voila! we can now do nearest neighbor searches within Python.

Next step: turn my LiDAR xyz point cloud into an xy-height point cloud, then dump it, height class by height class, into PostgreSQL. Wish me luck!


by smathermather at August 08, 2014 02:15 PM

August 07, 2014

Stephen Mather

Drivetime analyses, pgRouting

Map of overlapping 5-minute drive times from park access points
We’ve got some quick and dirty pgRouting-based code up on github. I say quick and dirty because it directly references the table names in both of the functions. I hope to fix this in the future.

The objective with this code is to input a point, use a nearest neighbor search to find the nearest intersection, and from that calculate a drive time alpha shape.

First, the nearest neighbor search:

CREATE OR REPLACE FUNCTION pgr.nn_search (geom geometry) RETURNS
int8 AS $$

SELECT id FROM pgr.roads_500_vertices_pgr AS r
ORDER BY geom <#> r.the_geom
LIMIT 1;

$$ LANGUAGE SQL VOLATILE;

This function takes one argument: pass it a point geometry, and it will do a knn search for the nearest point. Since we are leveraging pgRouting, it simply returns the id of the nearest intersection point. We wrote this as a function in order to run it, e.g. on all the points in a table. As I stated earlier, it directly references a table name, which is a little hackish, but we'll patch that faux pas later.

Now that we have the ID of the point in question, we can do a driving distance calculation, wrap that up in an alpha shape, and return our polygon. Again, we write this as a function:

CREATE OR REPLACE FUNCTION pgr.alpha_shape (id integer, minutes integer) RETURNS
geometry AS $$

WITH alphashape AS(SELECT pgr_alphaShape('WITH
DD AS (
SELECT seq, id1 AS node, cost
FROM pgr_drivingDistance(''SELECT id, source, target, cost FROM pgr.roads_500'',' || id || ', ' || minutes || ', false, false)),
dd_points AS (
SELECT id_ AS id, x, y
FROM pgr.roads_500_vertices_pgr v, DD d
WHERE v.id = d.node)
SELECT * FROM dd_points')),

alphapoints AS (
SELECT ST_Makepoint((pgr_alphashape).x, (pgr_alphashape).y) FROM alphashape),

alphaline AS (
SELECT ST_MakeLine(ST_MakePoint) FROM alphapoints)

SELECT ST_MakePolygon(ST_AddPoint(ST_MakeLine, ST_StartPoint(ST_MakeLine))) AS the_geom FROM alphaline

$$ LANGUAGE SQL VOLATILE;

Finally, we’ll use these functions in conjunction with a set of park feature access points to map our our 5-minute drive time. Overlaps in 5-minute zones we’ve rendered as a brighter green in the image above.

CREATE TABLE pgr.alpha_test AS
WITH dest_ids AS (
SELECT pgr.nn_search(the_geom) AS id FROM pgr.dest_pts
)
SELECT pgr.alpha_shape(id::int, 5)::geometry, id FROM dest_ids;

Oh, and I should point out that for the driving distance calculation, we have pre-calculated the costs of the roads based on the speed limit, so cost is really travel time in minutes.


by smathermather at August 07, 2014 01:50 AM

August 01, 2014

Stephen Mather

Plugin-free QGIS TMS tiles via GDAL

Want to load your favorite tiles into QGIS? How about plugin-free TMS tiles in QGIS via GDAL:

http://www.3liz.com/blog/rldhont/index.php?post/2012/07/17/OpenStreetMap-Tiles-in-QGIS

Really awesome… .

It needs but one change: epsg:900913 should be epsg:3857 or QGIS (GDAL?) throws an error. Presumably you could also define epsg:900913 in some config file, but barring that, create an XML file as follows and load it as a raster in QGIS:

<GDAL_WMS>
    <Service name="TMS">
        <ServerUrl>http://maps1.clemetparks.com/tilestache/tilestache.cgi/basemap/${z}/${x}/${y}.jpg</ServerUrl>
    </Service>
    <DataWindow>
        <UpperLeftX>-20037508.34</UpperLeftX>
        <UpperLeftY>20037508.34</UpperLeftY>
        <LowerRightX>20037508.34</LowerRightX>
        <LowerRightY>-20037508.34</LowerRightY>
        <TileLevel>18</TileLevel>
        <TileCountX>1</TileCountX>
        <TileCountY>1</TileCountY>
        <YOrigin>top</YOrigin>
    </DataWindow>
    <Projection>EPSG:3857</Projection>
    <BlockSizeX>256</BlockSizeX>
    <BlockSizeY>256</BlockSizeY>
    <BandsCount>3</BandsCount>
    <Cache />
</GDAL_WMS>

Now I can use my pretty tilestache maps _everywhere!_:
Image of coyote points overlayed on tilestache rendered hillshade map

Screenshot of map from postgis

edit: forgot the hat tip for the lead on this, Thomas Gratier: @ThomasG77


by smathermather at August 01, 2014 03:28 AM

Stephen Mather

Cleaning animal tracking data — throwing away extra points

Much the problem of the modern era: too much data, uneven data, and yet, should we keep it all?

Here’s the problem space: attach GPS collar to a coyote, send that data home, and you have a recipe for gleaning a lot of information about the movement of that animal across the landscape. In order to maximize the data collected while also maximizing the battery life of said device, sometimes (according to a preset scheme) you might sample every hour or every 15 minutes, and then switch back to once a day.

When you get the data back in, you get something like this:

Map of raw coyote data-- all points, showing all available collar GPS points

Already, we see some pretty interesting patterns. The first thing that I notice is clusters of activity. Are these clusters related to our uneven sampling, sometimes every 15 minutes, sometimes once or twice a day? Or are those really important clusters in the overall activity of the animal?

The next thing I notice is how perfectly the range of movement of the coyote is bounded by expressways to the west and north, and by the river to the east. Those seem to be pretty impermeable boundaries.

So, as does a man who only has a hammer, we clean the data with PostGIS. In this case, it’s an ideal task. First, metacode:

Task one: union / dissolve our points by day — for some days this will give us a single point, for other days a multi-point for the day.

Task two: find the centroid of our union. This will grab a centroid for each clump of points (a weighted spatial mean, if you will) and return that single point per day. There are some assumptions here that might not bear out under scrutiny, but it’s not a bad first approximation.

Now, code:

CREATE TABLE centroids AS
-- union / dissolve our points by day
WITH clusterp AS (
	SELECT ST_Union(geom) AS geom, gmt_date FROM nc_clipped_data_3734
		GROUP BY gmt_date
),
-- find the centroid of our union
centroids AS (
	SELECT ST_Centroid(geom) AS geom, gmt_date FROM clusterp
)
SELECT * FROM centroids;

Boom! One record per day:

Map showing much smaller home range, one record per day clustered around a single core zone

A different pattern now emerges. We can’t see the hard boundaries of the use area, but now the core portion of the home range can be seen.

Upcoming (guest) blog post on the use of these data in home range calculations in R. Stay tuned.

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under CC BY SA.


by smathermather at August 01, 2014 02:58 AM