Welcome to Planet PostGIS

December 19, 2014

Boundless Geo

OGR FDW FTW!

Merry Christmas, nerds! Have you ever wanted to connect your PostGIS database directly to an existing GIS file or database and read from it without importing the data? I have good news; repeat after me: OGR FDW FTW!

(How did this happen? Boundless provides its engineers with “innovation time” to pursue personal technical projects, and this year I chose to blow all my time in one go on this project. Like the idea of innovation time? Join us!)

OGR is the vector subsystem of GDAL, the open source format library. The OGR API lets applications read and write many different formats (Shapefile, Oracle, SDE, MapInfo, PostGIS, FGDB, GML, etc.) without having to import them first.

FDW is the PostgreSQL API for implementing "foreign data wrappers": virtual tables in the database that are actually connections to remote data files, repositories and servers. There are FDW implementations for connecting to MySQL, Oracle, SQLite, and even flat text files!

FTW is "for the win"! The OGR API is a simple table-oriented read-write library that is practically begging to be hooked up to the PostgreSQL FDW system, exposing OGR data sources as tables in PostgreSQL.

Here’s how it works.

First, go to the source code repository, then build and install the ogr_fdw extension.

Next, create the ogr_fdw extension and the postgis extension.

CREATE EXTENSION postgis;
CREATE EXTENSION ogr_fdw;

Now create a “server”. For a database FDW, the server would be an actual database server somewhere. For the OGR FDW, a server is an OGR connection string: it could be a database server, a directory full of files, or (as in this case) a web service:

CREATE SERVER opengeo
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource 'WFS:http://demo.opengeo.org/geoserver/wfs',
    format 'WFS' );

Now create a "foreign table". To the database, this will look just like a table, but accessing it will actually trigger an access to the remote server.

CREATE FOREIGN TABLE topp_states (
  fid integer,
  geom geometry,
  gml_id varchar,
  state_name varchar,
  state_fips varchar,
  sub_region varchar,
  state_abbr varchar,
  land_km real,
  water_km real )
  SERVER opengeo
  OPTIONS ( layer 'topp:states' );

Now, treat the table like you would any other PostGIS table, and ask it a question in SQL:

SELECT st_area(geom::geography) 
FROM topp_states 
WHERE state_name = 'Missouri';

And the answer comes back: 180863 sq km.

How does it work? The PostgreSQL query fires off an OGR query to the server (in this case, the OpenGeo Suite demo server), which pulls the table down; the data is then filtered and calculated upon locally in PostgreSQL.
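You can see this division of labor in the query plan. The sketch below is illustrative only: exact plan text and costs vary by PostgreSQL version, but a Foreign Scan node carrying the WHERE clause as a local Filter is the telltale sign that the restriction is applied in PostgreSQL rather than pushed down to OGR:

```sql
-- Illustrative sketch: exact costs and formatting vary by PostgreSQL version.
EXPLAIN SELECT st_area(geom::geography)
FROM topp_states
WHERE state_name = 'Missouri';

-- A plan shaped roughly like this, with the restriction as a local Filter:
--
--  Foreign Scan on topp_states  (cost=... rows=... width=...)
--    Filter: ((state_name)::text = 'Missouri'::text)
```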

Could it be better? Sure!

It could push SQL restriction clauses down to the OGR driver, reducing the quantity of data returned to the server. For big tables, this will be very important.

It could restrict the number of columns it returns to just the ones needed for the query. This will make things a little faster.

It could allow read/write access to the table, so that INSERT, UPDATE and DELETE queries can also be run. This opens up a whole world of interoperability possibilities: imagine your database being able to directly edit a File Geodatabase on the file system, or an ArcSDE server in a workgroup!

The biggest limitation of the PostgreSQL FDW system is that it requires a table definition before it can work, so you need a priori knowledge of the table structure to set up your tables. Because this just creates busywork, I've also bundled a utility program with the ogr_fdw extension: ogr_fdw_info. The utility will read an OGR data source and layer, and return the SQL you need to create an FDW server and table for reading that layer.
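For instance (a hypothetical run: the flag names, source path and column list below are illustrative, so check the utility's usage message for specifics), pointing ogr_fdw_info at a directory of shapefiles might emit SQL along these lines:

```sql
-- Hypothetical invocation (flags and paths are illustrative; see the
-- utility's help output):
--   ogr_fdw_info -s /data/shapefiles -l counties
-- which would emit SQL along these lines:
CREATE SERVER myserver
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (
    datasource '/data/shapefiles',
    format 'ESRI Shapefile' );

CREATE FOREIGN TABLE counties (
  fid integer,
  geom geometry,
  name varchar )
  SERVER myserver
  OPTIONS ( layer 'counties' );
```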

Enjoy wrapping your foreign tables, and enjoy the holidays!

The post OGR FDW FTW! appeared first on Boundless.

by Paul Ramsey at December 19, 2014 05:11 PM

PostGIS Development

Foreign Data Wrappers for PostGIS

The last couple weeks have seen two interesting updates in the world of PostgreSQL “foreign data wrappers” (FDW). Foreign data wrappers allow you to access remote data inside your database, exactly like other tables. PostgreSQL ships with two example implementations, one for accessing text files, and the other for accessing remote PostgreSQL servers.

The two updates of interest to PostGIS users are:

  • The Oracle FDW implementation was recently enhanced to support spatial columns, so an Oracle table with SDO_GEOMETRY columns can be exposed in PostgreSQL as a table with PostGIS geometry columns.
  • A new OGR FDW implementation was released that supports exposing any OGR data source as a table with PostGIS geometry columns.

Now you can access your PostGIS data without even going to the trouble of importing it first!

by Paul Ramsey at December 19, 2014 12:00 AM

December 18, 2014

Oslandia

New version of "Tempus", a framework for multimodal route planning


Oslandia and IFSTTAR are pleased to announce the release of a new version of their multimodal route planning framework: Tempus.



Tempus is an open source C++/Python framework that allows you to use, develop and test route planning algorithms, with a particular focus on multimodal routing and multi-objective requests, where every possible transport mode is taken into consideration for a trip: private cars, public transport, shared vehicles, bikes, etc.



Tempus features:

  • a C++ core based on Boost::Graph for efficient and generic multimodal graph manipulation
  • a dynamically loadable plugin architecture for routing algorithms
  • exposure of its API as a web service, through a FastCGI WPS server able to respond to simultaneous requests
  • a client for QGIS
  • a loader script able to load road network data from Navteq, TomTom and OpenStreetMap, public transport data from GTFS, and points of interest from CSV files or shapefiles
  • a Python request API

Tempus is distributed under the terms of the GNU LGPLv2+.


Tempus is on GitHub, get it, fork it, PR it: https://github.com/ifsttar/Tempus


The focus of Tempus is both on delivering an engine able to run efficient state-of-the-art routing algorithms and on offering a complete development environment to designers of routing algorithms. Thanks to the loader, a complete set of data from different sources can be integrated into a main (PostGIS) database. The modular architecture allows algorithms to be developed as plugins for the core. Several mechanisms are there to help during development: user variables for plugins, the possibility to output parts of the traversed graph, user metrics, etc.


This 1.2.0 release brings:

  • a routing algorithm able to deal with turning movement restrictions for each transport mode
  • a reverse graph adaptor that presents a reversed view of the graph, so that requests with an 'arrive before' constraint can be processed
  • a tool (in the QGIS plugin) to select a subset of the graph to work on
  • support for speed profiles on roads
  • an implementation of the A* algorithm

As with previous releases, documentation is available, including notes on compilation and installation.


A Windows installer is also available. It can be used to install Tempus on Windows, as well as (if necessary) QGIS 2.4 and PostGIS 2!


Development is still going on to include state-of-the-art algorithms, such as contraction hierarchies or RAPTOR.


Please let us know if you are interested in development, training or consulting around these pieces of technology. Do not hesitate to contact us at infos@oslandia.com.

by Hugo Mercier at December 18, 2014 09:00 AM

PostGIS Development

PostGIS 2.1.5 Released

The 2.1.5 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1, the 2.1.5 release. As befits a patch release, the focus is on bugs, breakages, and performance issues.

http://download.osgeo.org/postgis/source/postgis-2.1.5.tar.gz

Continue reading by clicking the title hyperlink.

by Paul Ramsey at December 18, 2014 12:00 AM

December 12, 2014

Boston GIS (Regina Obe, Leo Hsu)

AvidGeo LocationTech Tour Boston 2014, Synopsis

This past Monday we gave an introductory workshop on PostGIS. Our slides are here: PostGIS tutorial. Overall the event was very educational. I did have butterflies throughout, though, since we were the last to speak.

Talk slides

Talks were good, but most of the topics were not new to me, so they were harder to appreciate. The talk I thought was best was Steve Gifford's on Geospatial on Mobile. Two things I liked about it were that it covered a topic I didn't know much about (developing native 3D globe map apps for iOS and Android) and that Steve was a very funny speaker, with that kind of funniness that looks unplanned and natural. I wish it had a bit more Android content in it, though. The full list of talks and associated slides is below.

Workshops

The workshops, I think, were the best part of the event. I particularly enjoyed Andrew Hill's CartoDB talk (PostGIS without the pain but with all the fun), mostly because I hadn't used CartoDB before, so this was the first time I'd seen it in action. Being able to drag and drop a zip file from your local computer, or point at a URL of data on some site, and have a table ready to query was pretty cool. You can also schedule it to check for updates if it's a URL to a zipped shapefile or something. And of course, being able to write raw PostGIS spatial queries and build a map app in 5 minutes was pretty slick.

Ours came after, and unfortunately I think it was pretty dry and too technical for most GIS folks. Plus, we hadn't told people to download the files beforehand, so it was next to impossible to follow along. We should have called it PostGIS with the pain and on a shoestring budget.

by Regina Obe (nospam@example.com) at December 12, 2014 10:11 PM

December 06, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

Oracle FDW 1.1.0 with SDO_Geometry PostGIS spatial support

Oracle FDW is a foreign data wrapper PostgreSQL extension that allows you to read and write to Oracle database tables from a PostgreSQL database. You can get it via the PGXN network or the main website http://laurenz.github.io/oracle_fdw/.

What is new in the latest 1.1.0 release is support for the Oracle SDO_GEOMETRY type, which allows you to map the most common geometry types (POINT, LINE, POLYGON, MULTIPOINT, MULTILINE and MULTIPOLYGON) to the PostGIS geometry type. Much of the spatial plumbing work was done by Vincent Mora of Oslandia. If we have any Windows Oracle users out there: yes, there are binaries available for Windows for PostgreSQL 9.1-9.4, both 32-bit and 64-bit. The FDW does have a dependency on OCI.dll, which I think comes shipped with Oracle products. Unfortunately, we are not Oracle users, so we can't kick the tires.
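As a sketch of what the setup looks like (we can't test this ourselves, so treat it as illustrative: the connection string, credentials, and the CITIES table with its SDO_GEOMETRY column are all hypothetical, though the option keys follow the oracle_fdw documentation):

```sql
-- Illustrative only: connection details, credentials and table names
-- are hypothetical.
CREATE EXTENSION oracle_fdw;

CREATE SERVER oradb
  FOREIGN DATA WRAPPER oracle_fdw
  OPTIONS ( dbserver '//oracle.example.com:1521/ORCL' );

CREATE USER MAPPING FOR CURRENT_USER
  SERVER oradb
  OPTIONS ( user 'scott', password 'tiger' );

-- With 1.1.0, an SDO_GEOMETRY column can be mapped to a PostGIS
-- geometry column.
CREATE FOREIGN TABLE cities (
  id   integer,
  name varchar,
  geom geometry )
  SERVER oradb
  OPTIONS ( schema 'SCOTT', table 'CITIES' );
```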

by Leo Hsu and Regina Obe (nospam@example.com) at December 06, 2014 05:33 PM

December 02, 2014

Paul Ramsey

The Tyranny of Environment

Most users of PostGIS are safely ensconced in the world of Linux, and their build/deploy environments are pretty similar to the ones used by the developers, so any problems they might experience are quickly found and removed early in development.

Some users are on Windows; they are our most numerous user base, so we at least test that platform preemptively before release and make sure it is as good as we can make it.

And then there's the rest. We've had a passel of FreeBSD bugs lately, I've found myself doing Solaris builds for customers, and don't get me started on the poor buggers running AIX. One of the annoyances of trying to fix a problem for a "weird platform" user is just getting the platform set up and running in the first place.

So, having recently learned a bit about Vagrant, and seeing that some of the "weird" platforms already have boxes, I thought I would whip up a couple of Vagrant configurations so that in future it's easy to throw up a Solaris or FreeBSD box, or even a quick CentOS box, for debugging purposes.

I've just been setting up my Solaris Vagrantfile, using my favourite Solaris crutch: the OpenCSW software repository. But as I use it, I'm not just adding the "things I need", I'm implicitly choosing an environment:

  • my libxml2 is from OpenCSW
  • so is my gcc, which is version 4, not version 3
  • so is my postgres

This is convenient for me, but what are the chances that it'll be the environment used by someone on Solaris who is having problems? They might be compiling against libraries from /usr/sfw/bin, or using the Solaris gcc-3 package, or any number of other variants. At the end of the day, when testing on such a Solaris environment, will I be testing against a real situation, or a fantasyland of my own making?

For platforms like Ubuntu (apt), Red Hat (yum) or FreeBSD (ports), where there is One True Way to get software, the difficulties are smaller, but even then there is no easy way to get a "standard environment", or to quickly replicate the combination of versions a user might have run into that is causing problems (libjson is a current source of pain). DLL hell has never really gone away; it has just found new ways to express itself.

(I will send a psychic wedgie to anyone who says "docker", I'm not kidding.)

by Paul Ramsey (noreply@blogger.com) at December 02, 2014 09:22 PM

November 30, 2014

Archaeogeek (Jo Cook)

PostGIS Day 2014

On 20th November this year we held the first PostGIS "Show-and-Tell" Day at the HQ of the British Computer Society. This was the first "official" OSGeo:UK event since hosting FOSS4G 2013 in Nottingham last year (it took us that long to recover from the stress!).

We deliberately chose to make the event extremely light and informal, with the minimum of organisation, and this seemed to work pretty well. We had a range of speakers doing both longer talks and lightning talks, and then a session at the end where we discussed useful tools and add-ons for working with PostGIS.

The only caveats we placed on speakers were that the talk should (obviously) include PostGIS or PostgreSQL in some way, and that overt sales pitches would be considered uncool. Luckily our speakers obliged! There was also some really good audience interaction.

Highlights for me included Foreign Data Wrappers, for getting at information stored in other databases or behind other APIs through PostgreSQL, and raster processing. I was particularly excited by the Twitter FDW, but it turns out that it doesn't work with the latest version of the Twitter API, and it doesn't look like a particularly well-maintained piece of code. You can see links to all the talks, and notes from the discussion we had at the end, at our OSGeo:UK GitHub repository for the event: http://osgeouk.github.io/pgday/.

It transpired that there were a number of other events globally, also celebrating #PostGISDay, and perhaps next year it would be good to coordinate these better.

From an OSGeo:UK perspective, we have plans to hold more of these light-touch events, on a range of subjects (ideas gratefully received). Personally, I would like to have charged a nominal fee for attending the event, not to make a profit but so that we could make a small donation back to PostGIS; charging money, however, makes the whole process more complicated! We were extremely lucky in this case to get the venue for free, courtesy of the BCS Location Information Group, and to get the food sponsored by Astun Technology.

All in all, a really fun event, both to organise and to attend- thanks to all the speakers and attendees, and here’s to another event quite soon.

November 30, 2014 08:00 AM

November 27, 2014

Boston GIS (Regina Obe, Leo Hsu)

osm2pgrouting for Windows

For loading OSM data in a format ready for pgRouting queries, the two common tools I've seen used are osm2po and osm2pgrouting. Unfortunately, osm2pgrouting has long been a Unix-only tool, mostly because no one had tried to compile it for Windows or test whether it works there. This meant that Windows users who wanted a quick road to pgRouting nirvana had to use osm2po. It turns out the osm2pgrouting code compiles fine on Windows (at least under the mingw-w64 chain) and seems to work fine, so we should start making it available for Windows.

This is still experimental, but if you are a Windows user (or have a pgRouting Windows user friend) and want to kick the tires a bit, you'll find osm2pgrouting for Windows (both 32-bit and 64-bit binaries) on the PostGIS windows download page in the Unreleased section.

I compiled osm2pgrouting (for pgRouting 2.0) against PostgreSQL 9.4, using the same mingw-w64 chain I use to build PostGIS and pgRouting for Windows, clumsily detailed in my gist shell scripts. Though it depends on libpq, it seems to work fine against the PostgreSQL 9.3 libpq. So you just need to copy the additional files (osm2pgrouting and expat) into your PostgreSQL bin folder and you are good to go loading data into a pgRouting-enabled database. It is assumed you already have PostGIS 2.1+ and pgRouting 2.0 installed, which come bundled together as part of the PostGIS 2.1+ Windows installers and are also available on the PostGIS windows download page.

Which should you use: osm2pgrouting or osm2po?

In many ways osm2po is superior to osm2pgrouting and has had more time invested in it. It can handle larger files on smaller machines, it can work with the OSM PBF format, and it supports both Unix-like and Windows platforms. That said, osm2po is not absolutely superior to osm2pgrouting; it is inferior in a couple of ways that are important to me.

  • It's not open source, it's freeware, which means you can't kick the code around, learn from it, or inspect it. It also means you can't change it and contribute back your changes.
  • The primary purpose of osm2po is not to be a loading tool for pgRouting: osm2po is a complete routing engine with a web server, and it doesn't even rely on PostgreSQL. If the maintainer decides one day that providing a pgRouting export is no longer beneficial to the osm2po development path, or the pgRouting devs need to tweak it to work with a newer pgRouting, things will be difficult.
  • It's written in Java. This is my own personal dislike; not that anything is wrong with Java, it's just sometimes a bit of a nuisance to get the VM going on a server I happen to be stuck on where I can't install anything I want. osm2pgrouting is much lighter weight, not requiring a JVM, though it does require PostgreSQL, PostGIS, and pgRouting, which you should have already.
  • osm2po generates an intermediary SQL file which you then have to load via psql. I prefer the direct-load approach of osm2pgrouting over having a big ol' SQL file to contend with afterwards.

That said, osm2po is great in many ways, but alternatives are always nice to have. The only annoying thing about having alternatives is deciding which one is best for what you are doing.

by Regina Obe (nospam@example.com) at November 27, 2014 11:34 PM

November 21, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

PostGIS Day synopsis

Yesterday was PostGIS Day or, as some may call it, Post GIS Day, and a couple of interesting things happened on the day:

  • PostgreSQL 9.4 RC1 came out.
  • There were parties and unconferences, many summarized on http://2014.postgisday.rocks
  • I managed to entertain myself with Conway's Game of Life, PostGIS raster map algebra style, and pondered how wonderful PostGIS would be if it could generate animated GIFs with some sort of aggregate function, for which I was politely called crazy by some of my fellow PSC friends.

But what was greatest of all and took the cake were these pictures:


Continue reading "PostGIS Day synopsis"

by Leo Hsu and Regina Obe (nospam@example.com) at November 21, 2014 09:16 PM

November 20, 2014

Boundless Geo

Happy PostGIS Day!

Yesterday was GIS Day, which means that today is PostGIS Day — get it? Post-GIS Day! To celebrate, we're giving a 50% discount on online PostGIS training through the end of the week! Visit our training catalog and use promo code "postgis2014" to take advantage of this offer.

A lot has happened since last year, when I joined Stephen Mather, Josh Berkus, and Bborie Park in a PostGIS all-star hangout with James Fee.

In case you missed them, here are some features from our blog and elsewhere that highlight what's possible with PostGIS:

And, as always, be sure to check out our workshops for a slew of PostGIS-related courses, including Introduction to PostGIS and our Spatial Database Tips & Tricks.

Interested in learning about or deploying PostGIS? Boundless provides support, training, and maintenance for installations of PostGIS. Contact us to learn more.

The post Happy PostGIS Day! appeared first on Boundless.

by Paul Ramsey at November 20, 2014 01:54 PM

Boston GIS (Regina Obe, Leo Hsu)

PostGIS Day Game of Life celebration

For this year's PostGIS Day, I decided to celebrate with a little Conway's Game of Life fun, inspired by Anita Graser's recent blog series Experiments with Game of Life. The path I chose to simulate the Game of Life is a little different from Anita's. This variant exercises PostGIS 2.1+ raster map algebra and PostgreSQL 9.1+ recursive queries. Although you can do this with PostGIS 2.0, the map algebra syntax I am using is only supported in PostGIS 2.1+. My main disappointment is that, because PostGIS does not yet support direct generation of animated GIFs, I had to settle for a comic strip built by unioning frames of rasters instead of a motion picture. Hopefully some day my PostGIS animated GIF dream will come true.
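The overall shape of that computation can be sketched with a recursive CTE. This is not the code from the post: `life_step(raster)` is a hypothetical helper standing in for a PostGIS 2.1 map algebra call that applies Conway's rules to each cell's neighborhood, and `seed_board` is a hypothetical table holding the starting raster:

```sql
-- Schematic sketch only: life_step() and seed_board are hypothetical,
-- standing in for an ST_MapAlgebra-based generation function and the
-- table holding the initial raster state.
WITH RECURSIVE life AS (
    SELECT 0 AS generation, rast
    FROM seed_board
  UNION ALL
    SELECT generation + 1, life_step(rast)
    FROM life
    WHERE generation < 10     -- number of frames to compute
)
SELECT generation, rast
FROM life;
-- The "comic strip" is then assembled by unioning the frames side by side.
```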


Continue reading "PostGIS Day Game of Life celebration"

by Regina Obe (nospam@example.com) at November 20, 2014 07:04 AM

November 18, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4GNA 2015 and PGDay San Francisco March 2015

PGDay 2015 San Francisco will be held March 10th, 2015 at the Hyatt San Francisco Airport in Burlingame, CA (just outside San Francisco). This year PGDay will be hosted alongside the Free and Open Source Geospatial North America (FOSS4GNA) 2015 conference, which runs March 9th-12th, and EclipseCon NA 2015. Speaker submissions for FOSS4GNA 2015 and EclipseCon NA 2015 end this Monday, November 17th, 2014.


Continue reading "FOSS4GNA 2015 and PGDay San Francisco March 2015"

by Leo Hsu and Regina Obe (nospam@example.com) at November 18, 2014 03:46 AM

November 03, 2014

Boston GIS (Regina Obe, Leo Hsu)

AvidGeo Conference: LocationTech Tour Boston, Dec 8 2014

We'll be giving a PostGIS Introduction tutorial at AvidGeo Conference: LocationTech Tour Boston which will be held 8 December 2014 from 8:30 AM to 4:30 PM at:

Hack / Reduce
275 Third St
Cambridge, MA, 02142

Tickets are $40 for admission.

In addition to our PostGIS intro tutorial, there is a great line-up of talks from visionaries such as Adena Schutzberg, speaking on the State of the Geospatial Industry, and Raj Singh, on the topic of the NoSQL Geo Landscape. Also check out the other tutorials, on CartoDB and QGIS.

by Regina Obe (nospam@example.com) at November 03, 2014 06:40 AM

October 26, 2014

Stephen Mather

Airspace — A deep rabbit hole

In previous maps we looked at Class B, C, and D airspace. Let’s add in Class E0 and E5… (not yet in 3D):

(Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL)

Map showing Class B, C, D, E0, and E5 airspace

Previous posts:

http://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again/

and

http://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again-reprise/


by smathermather at October 26, 2014 02:16 AM

October 25, 2014

Stephen Mather

Airspace is complicated — and so I abuse PostGIS once again — Reprise…

In the previous post, http://smathermather.wordpress.com/2014/10/25/airspace-is-complicated-and-so-i-abuse-postgis-once-again/, we explored the 3D shape and complexity of controlled airspace.

Now here’s the rest of the code. We’ll add our affine transformation à la Seth Fitzsimmons:

    SELECT 
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )
    AS geom

And integrate that into our original function:

-- Inputs are a geometry and an elevation to move the geometry to
CREATE OR REPLACE FUNCTION threed_iso(footprint geometry, elevation numeric)
  RETURNS geometry AS
$BODY$

-- Force 3D, then translate to the input elevation
WITH floor AS
(
    SELECT ST_Translate( ST_Force3DZ(footprint), 0, 0, elevation ) AS geom
),
-- Now make isometric (and begin Seth Code)
iso AS
(
    SELECT 
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )

    AS geom
    FROM floor
)
-- We'll force it back to 2D so QGIS is happy
SELECT ST_Force2D(geom) FROM iso
;
$BODY$
  LANGUAGE sql VOLATILE
  COST 100;

And voila!

DROP TABLE IF EXISTS class_c_isoc;

CREATE TABLE class_c_isoc AS
	SELECT gid, airspace, name, lowalt, highalt, threed_iso(geom, lowalt::numeric * 5) AS geom
	FROM class_c_subset;

Let’s take a look at Washington, DC and surrounds, another nice complicated example:

3D Figure of DC controlled airspace

And again with map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL:

3D Figure of DC controlled airspace

 


by smathermather at October 25, 2014 05:17 PM

Stephen Mather

Airspace is complicated — and so I abuse PostGIS once again

Let’s ignore for a moment the drone hobbyist / enthusiast. What is the shape of airspace for airplanes and for commercial and government unmanned aircraft flying under Certificates of Authorization, and how can we visualize it?

Thanks to Anita in the last post, we have the Class B,C,D,E Airspace Shape Files which helps us define the overall shape of controlled airspace.

Map of Detroit, Cleveland, and Pittsburgh Class B Airspace

But these are 3D things, and I want to visualize them thus. Let us put some constraints on the problem. Let’s do it all in PostGIS, so that we can see it in QGIS or on the web. Let’s not use full PostGIS 3D (i.e. the SFCGAL library), not because it isn’t awesome (it truly is) but because it can be hard to install at the moment (although see https://github.com/vpicavet/docker-pggis for an easy way with docker). Finally, true 3D with unconstrained viewing angles and 100% flexibility is… is… well, it usually sucks. So, we’ll stick to isometric viewing (thanks to Seth Fitzsimmons of Stamen http://stamen.com/ for his PostGIS isometric code, which will be released upon his word). (Update — all the code is there…):

-- Inputs are a geometry and an elevation to move the geometry to
CREATE OR REPLACE FUNCTION threed_iso(footprint geometry, elevation numeric)
  RETURNS geometry AS
$BODY$

-- Force 3D, then translate to the input elevation
WITH floor AS
(
    SELECT ST_Translate( ST_Force3DZ(footprint), 0, 0, elevation ) AS geom
),
-- Now make isometric (and begin Seth Code)
iso AS
(
    SELECT
        ST_Affine(
            ST_Rotate(geom, -pi() / 2),
            -- isometric
            cos(pi() / 6), -cos(pi() / 6), 0,
            sin(pi() / 6), sin(pi() / 6), 1,
            0, 0, 0,
            0, 0, 0
        )

    AS geom
    FROM floor
)
-- We'll force it back to 2D so QGIS is happy
SELECT ST_Force2D(geom) FROM iso
;
$BODY$
  LANGUAGE sql VOLATILE
  COST 100;

Ok, now let’s rock some geometries with this bad function:

DROP TABLE IF EXISTS class_c_isoc;

CREATE TABLE class_c_isoc AS
	SELECT gid, airspace, name, lowalt, highalt, threed_iso(geom, lowalt::numeric * 5) AS geom
	FROM class_c_subset;

And what do our controlled airspaces look like?

Isometric view of Cleveland controlled airspace

Kind of conical, in this case with some “wings” that serve as approaches. It makes sense, I guess. At the center, where the airport is, controlled airspace goes from the ground up. As we get further from the airport, a set of concentric rings starting at higher elevations provide a controlled space that allows for landing and taking off.

Image of Cleveland airspace with Stamen Toner map as backdrop

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.

There are more complicated ones, naturally. We need look no further than Detroit for additional complexity:

Visualization of Detroit controlled airspace

airspace_detroit_toner

No wonder every time I fly through Detroit, no matter where I’m going, I end up flying over Lake Erie.

If we want really complicated, we need look no further than Cincinnati:
airspace_cinci

What is going on there? Is that due to shape of the hills, or city boundaries?

Finally, what does airspace look like over Ohio, West Virginia, and Pennsylvania (etc.), overall?
airspace_is_complicated_regional

And while the following map isn’t quite right, here is a figure including many of the small airports sans controlled airspace:

View of all controlled and uncontrolled airspace across Ohio and neighbors.

May an aeronautical pox be upon you!
The above line was not intended in bad taste, but just an homage to the “red dot fever” of early neo-geography (neo-geography which I’m informed definitionally doesn’t exist). Only a few minutes ago, it dawned on me that the deleted phrase could be misinterpreted these days… .

On a related note, if you want an interesting analysis of ebola, read Zeynep Tufekci’s analysis.

(For the record, all heights are exaggerated by 5x, for clarity of reading.)

Also, in case it wasn’t obvious: Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.


by smathermather at October 25, 2014 06:13 AM

October 20, 2014

Oslandia

Oslandia releases "Cuardo", 3D GIS viewer


Oslandia announces today the preliminary result of one of its research and development projects: Cuardo.



Cuardo is an open source WebGL 3D data viewer, focusing on urban data analysis and visualization.



Cuardo features:

  • 3D terrain data model
  • Multi-textured terrain (satellite, thematic)
  • Draping of 2D vector data
  • Textured 3D meshes
  • Custom styling, with 3D effects (extrusion…)
  • Level-of-detail management
  • Classic 3D user navigation
  • Web services integration (WMS, WFS)
  • Identify tool with custom actions (popups, links…)
  • Focus on performance
  • High customization possibilities

The specificity of this tool is that it deals really well with 3D features as objects, including LOD management, whereas most 3D viewers cannot distinguish objects, which makes object picking hard. Being able to visualize a full city in your browser, without any plugin to install, and to interact with other client applications, makes Cuardo a good option for developing applications requiring 3D visualization, which you can mix with other great web libraries.


Cuardo is on GitHub, get it, fork it, PR it: https://github.com/Oslandia/cuardo


This software component is part of our 3D software stack. This is Cuardo's first public release; it is still under heavy development, but already usable for interesting use cases, as you can see in the video above.


We are actively working on a full-featured 3D software solution focused on 3D geographical data management. Our aim is to provide a full set of components allowing you to:

  • Import 3D data
  • Transform 2D data into 3D data
  • Store high volumes of data
  • Perform 3D analysis
  • Combine 3D data with elevation models, point clouds, raster data, 2D vector data, and alphanumeric data
  • Set up web services for data diffusion, including 3D data
  • Visualize the data in a browser

The release of the Cuardo web client supplies the missing piece of a vertical 3D GIS data management stack.


Oslandia has strong expertise in 3D GIS, on the server side and, as this announcement shows, on the client side.


Oslandia is committed to improving this software solution, adapting it, and making it fit your specific use cases. Do not hesitate to get in touch if you want more information on how to take best advantage of your 3D spatial data. We will be delighted to hear about your use cases: infos+3d@oslandia.com.


We would like to thank the Grand Lyon metropolitan area for releasing a lot of GIS data as open data, including textured 3D models: http://www.smartdata.grandlyon.com

by Vincent Picavet at October 20, 2014 08:00 AM

October 09, 2014

Boston GIS (Regina Obe, Leo Hsu)

FOSS4G NA 2015 Call for Speakers now Open

FOSS4G NA 2015 (Free and Open Source Geospatial North America conference) will run March 9th-12th, 2015 in Burlingame, California (Hyatt, San Francisco Airport). The call for talks and workshops is now open, and we've already got some submissions to review. The deadline for submission is November 17th, 2014. Presenters (both workshop presenters and those giving a 35-minute standard talk) attend the three days of general sessions for free. The conference consists of one day of workshops and three days of general sessions, with lightning talks on some or all of the general session days.

Details about presentations are here: CFP details. FOSS4G NA 2015 is a collaborative event co-sponsored by OSGeo and LocationTech.

FOSS4G NA 2015 will be co-hosted with EclipseCon North America 2015. Admission to FOSS4G NA 2015 gives you admission to EclipseCon NA 2015, so you get two conferences for the price of one; go conference hopping.

For PostgreSQL and PostGIS folks, there will likely be a PostgreSQL day at the same venue during the same period, so we might have a very special PostGIS / PostgreSQL day. I'll post updates as more details surface.

by Regina Obe (nospam@example.com) at October 09, 2014 03:02 AM

October 01, 2014

Boundless Geo

PostGIS Training: Creating Overlays

At Boundless, we recognize that software is only as good as the problems it can be used to solve. That's why we don't just develop the best open source spatial software; we also teach you how to use it, with support, training, and workshops directly from the experts.

One question that comes up often during our PostGIS training is "how do I do an overlay?" The terminology varies: sometimes the operation is called a "union", sometimes an "intersect". What people mean is: "can you turn a collection of overlapping polygons into a collection of non-overlapping polygons that retain information about the overlapping polygons that formed them?"

screenshot_67

So an overlapping set of three circles becomes a non-overlapping set of 7 polygons. screenshot_69

Calculating the overlapping parts of a pair of shapes is easy, using the ST_Intersection() function in PostGIS, but that only works for pairs, and doesn't capture the areas that have no overlaps at all. How can we handle multiple overlaps and get out a polygon set that covers 100% of the area of the input sets? By taking the polygon geometry apart into lines, and then building new polygons back up. Let's construct a synthetic example: first, generate a collection of random points, using a Gaussian distribution, so there's more overlap in the middle. The crazy math in the SQL below just converts the uniform random numbers from the random() function into normally distributed numbers.

CREATE TABLE pts AS
WITH rands AS (
  SELECT generate_series as id, random() AS u1, random() AS u2 FROM generate_series(1,100)
)
SELECT
  id,
  ST_SetSRID(ST_MakePoint(
    50 * sqrt(-2 * ln(u1)) * cos(2*pi()*u2),
    50 * sqrt(-2 * ln(u1)) * sin(2*pi()*u2)),4326) AS geom
FROM rands;

The result looks like this: screenshot_70 Now, we turn the points into circles, big enough to have overlaps.

CREATE TABLE circles AS
SELECT id, ST_Buffer(geom, 10) AS geom FROM pts;

Which looks like this: screenshot_71 Now it's time to take the polygons apart. In this case we'll take the exterior ring of the circles, using ST_ExteriorRing(). If we were dealing with complex polygons with holes, we'd have to use ST_DumpRings(). Once we have the rings, we want to break the lines everywhere the rings cross, so that no lines cross each other; they only touch at their end points. We do that with the ST_Union() function.

CREATE TABLE boundaries AS
SELECT ST_Union(ST_ExteriorRing(geom)) AS geom
FROM circles;

What comes out is just lines, but with end points at every crossing. screenshot_72 Now that we have noded lines, we can collect them into a multi-linestring and feed them to ST_Polygonize() to generate polygons. The polygons come out as one big multi-polygon, so we'll use ST_Dump() to convert it into a table with one row per polygon.

CREATE SEQUENCE polyseq;
CREATE TABLE polys AS
SELECT nextval('polyseq') AS id, (ST_Dump(ST_Polygonize(geom))).geom AS geom
FROM boundaries;

Now we have a set of polygons with no overlaps, only one polygon per area. screenshot_73 So, how do we figure out how many overlaps contributed to each incoming polygon? We can join the centroids of the new small polygons with the set of original circles, and calculate how many circles contain each centroid point. screenshot_74 A spatial join will allow us to calculate the number of overlaps.

ALTER TABLE polys ADD COLUMN count INTEGER DEFAULT 0;
UPDATE polys SET count = p.count
FROM (
  SELECT count(*) AS count, p.id AS id  
  FROM polys p 
  JOIN circles c 
  ON ST_Contains(c.geom, ST_PointOnSurface(p.geom)) 
  GROUP BY p.id
) AS p
WHERE p.id = polys.id;

That's it! Now we have a single coverage of the area, where each polygon knows how much overlap contributed to it. Ironically, when visualized using the coverage count as a variable in the color ramp, it looks a lot like the original image, which was created with a simple transparency effect. However, the point here is that we've created new data, in the count attribute of the new polygon layer. screenshot_75 The same decompose-and-rebuild-and-join-centroids trick can be used to overlay all kinds of features, and to carry over attributes from the original input data, achieving the classic "GIS overlay" workflow. Happy geometry mashing!

Want to learn more? Try our Introduction to PostGIS online training course!

The post PostGIS Training: Creating Overlays appeared first on Boundless.

by Paul Ramsey at October 01, 2014 03:33 PM

September 30, 2014

Boston GIS (Regina Obe, Leo Hsu)

Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile

One of the new features coming in PostGIS 2.2 is ST_ClipByBox2D (thanks to Sandro Santilli's recent commits, funded by CartoDB). However, to take advantage of it, you'll need PostGIS compiled against GEOS 3.5+ (a very recent build), which has not been released yet. Windows folks: the PostGIS 2.2 experimental binaries for 9.3 and 9.4 are built against the latest GEOS 3.5 development branch, so you should be able to test this out with Winnie's experimental builds.

Since the dawn of PostGIS, PostGIS users have needed to mutilate their geometries in often painful and horrific ways. Why is the ST_ClipByBox2D function useful? Because it's a much faster way of mutilating your geometries against a rectangular mold than using ST_Intersection. Why would you want to mutilate your geometries? There are many reasons, but I'll give you one: as your geometry's area approaches the area of its bounding box, and as your bounding box size decreases, your spatial index becomes more efficient. You can consider this article a redux of the article I wrote (eons ago), Map Dicing and other stuff, which describes the same approach with much older technology. Though I will be using more or less the same dataset, Massachusetts TOWNSSURVEY_POLYM (it's newer, so you can't really compare), and I have tried to simplify the exercise a bit (not resorting to temp tables and such), my focus in this article is to compare the speed of the new ST_ClipByBox2D approach against the old ST_Intersection approach. The spoiler, for those who don't have the patience for this exercise, is that ST_ClipByBox2D, at least on my sample data set on my puny Windows 7 64-bit desktop running PostgreSQL 9.4beta2 64-bit, was about 4-5 times faster than ST_Intersection. There was a downside to this speedup: with the ST_Intersection approach I had no invalid polygons, while with ST_ClipByBox2D I had one invalid polygon. So, as noted in the docs, use with caution. We'd be very interested in hearing other people's experiences with it.

One other benefit that ST_ClipByBox2D has over ST_Intersection, which I didn't test out, is that while ST_Intersection doesn't work with invalid geometries, ST_ClipByBox2D does.

For these exercises, I'm also going to abuse the PostGIS raster function ST_Tile for geometry use, because, for some strange reason, we have no ST_Tile function for PostGIS geometry. For those who missed the improper use of ST_Tile for raster, refer to Waiting for PostGIS 2.1 - ST_Tile.
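
To give a sense of the shape of the comparison, here is a hedged sketch (the towns and grid tables and their columns are hypothetical, not the article's actual benchmark code; ST_ClipByBox2D requires PostGIS 2.2 with GEOS 3.5+):

```sql
-- Old approach: exact intersection of each town polygon with a grid cell
SELECT t.town_id, ST_Intersection(t.geom, g.geom) AS geom
FROM towns t
JOIN grid g ON t.geom && g.geom;

-- New approach: clip by the cell's 2D box. Roughly 4-5x faster on the
-- author's test set, but it may emit the occasional invalid polygon,
-- so consider checking the output with ST_IsValid.
SELECT t.town_id, ST_ClipByBox2D(t.geom, g.geom::box2d) AS geom
FROM towns t
JOIN grid g ON t.geom && g.geom;
```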


Continue reading "Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile"

by Regina Obe (nospam@example.com) at September 30, 2014 02:55 PM

September 26, 2014

PostGIS Development

Getting distinct pixel values and pixel value counts of a raster

PostGIS raster has many, many functions, and often at least 10 ways of doing the same thing, some much slower than others. Suppose you have a raster, or a raster area of interest (an elevation raster, for example), and you want to know the distinct pixel values in the area. The temptation is to reach for the ST_Value raster function, but there is a much more efficient function to use: ST_ValueCount.

The ST_ValueCount function is one of many statistical raster functions available in PostGIS 2.0+. It is a set-returning function that returns two values for each row: a pixel value (value) and the count of pixels (count) in the raster that have that value. It also has variants that let you filter for particular pixel values.

This tip was prompted by a question on StackExchange: How can I extract all distinct values from a PostGIS Raster?
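
As a minimal sketch (the dem table and rast column are assumed for illustration, not taken from the post), getting distinct values and their pixel counts across a tiled raster table looks like:

```sql
-- Distinct pixel values and total pixel counts, summed over all tiles
-- of a hypothetical elevation table 'dem' with a raster column 'rast'
SELECT (pvc).value, SUM((pvc).count) AS total_pixels
FROM (SELECT ST_ValueCount(rast) AS pvc FROM dem) AS counts
GROUP BY (pvc).value
ORDER BY (pvc).value;
```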

Continue Reading by clicking title hyperlink ..

by Regina Obe at September 26, 2014 12:00 AM

September 25, 2014

Archaeogeek (Jo Cook)

New Portable GIS Releases

I’m pleased to announce not one, but two new releases of Portable GIS! The first, version 4.2, contains QGIS 2.4 and PostGIS 1.5 and will be the last release to include that version of PostGIS.

The second, version 5, contains QGIS 2.4 and PostGIS 2.1 and all future releases will be based on this. Get them here:

There are two important things about these two releases. The first is that the download size is almost twice what it was previously; this is out of my control and is due to the increased size of the QGIS installation. The second is that version 5.0 should be considered a beta release, as it's very much untested. If there are any problems, let me know via the Portable GIS Google group.

You might also be interested to see a short presentation I did about Portable GIS at GeoMob London in September: http://archaeogeek.github.io/geomob-portablegis/#/

September 25, 2014 08:49 AM

September 24, 2014

Oslandia

Back from FOSS4G 2014 - part 2



Beyond Oslandia's participation at FOSS4G 2014 in Portland (see part 1), many topics were discussed during this week of conference.


Presentation's slides and video are being put online and you can replay the whole conference if you could not attend, but here is a really short summary of a few things we noticed during the five days.

  • Fiona and Rasterio are now mature, modern GIS Python libraries on top of GDAL/OGR, and are continuously improving.
  • WebGL is all over the place. With new support from the various browser vendors, this is definitely the backend of choice for rendering large datasets on online maps. This is still the exploration phase, though: there is currently no definitive solution for WebGL mapping, as things are evolving really fast and new use cases keep appearing.
  • Use cases for open source GIS software continue to grow larger. It even goes to space, with the PlanetLabs satellite constellation!
  • OpenLayers 3 and Leaflet are the two client libraries of choice, with distinct use cases, but excellent quality in both.
  • Design is here, for the good of our eyes and user experience. This is the first time a specific track has been fully dedicated to map design.
  • Vector strikes back! With the evolution of browser technologies, vector data is gaining momentum. Vector tiles are a natural evolution here, and formats and standards are starting to emerge. More performance, less space, and a door open to new end-user features.
  • Following the vector movement, symbology is also becoming dynamic. What if those colors blended nicely from one to another, and the thickness of those lines were controlled by a function of the zoom level? Better UX!
  • GIS involves large datasets, and open source software is now ready to crunch terabytes of data. Be it mosaicking rasters for the whole world, machine learning to predict climate change, or distributed data analysis, data size limits continue to be pushed further.
  • Well-known open source components continuously gain features: MapServer, QGIS, GDAL/OGR, and PostGIS improve from version to version.
  • CartoDB raised 8 million USD and continues to grow.
  • A large part of the Mapbox team was present and showed various technologies at the bleeding edge of online mapping.
  • Many large organizations were present, among them TriMet, US DoD, Amazon (sponsor), GitHub, ESRI, and Red Hat… Big players are in the place too.

FOSS4G is technical and at the same time user-oriented. It gathers the open source community, industry, and academics alike. This is the best way to foster intense brainstorming and create new ideas, visions, and technologies.


Do not hesitate to check out the videos and tell us your favorites, by tweeting @oslandia or mailing us about any of these subjects at infos@oslandia.com!

by Vincent Picavet at September 24, 2014 05:00 PM

September 23, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4G and PGOpen 2014 presentations

At FOSS4G we gave two presentations. The videos from the other presentations are in the FOSS4G 2014 album. I have to commend the organizers for gathering such a rich collection of videos. Most of the videos turned out very well and are also downloadable as MP4 in various resolutions. It really was a treat being able to watch all I missed. I think there are still some videos that will be uploaded soon. As mentioned, there were lots of PostGIS/PostgreSQL talks (or at least people using PostGIS/PostgreSQL in GIS).


Continue reading "FOSS4G and PGOpen 2014 presentations"

by Leo Hsu and Regina Obe (nospam@example.com) at September 23, 2014 06:55 PM

Oslandia

Back from FOSS4G 2014 - part 1



FOSS4G, the main international OSGeo event, takes place once a year and gathers everyone involved in open source geospatial technologies at large.

This year, the conference took place in Portland, Oregon, and gathered around 1000 people, on topics ranging from deeply technical MapServer pro tips, to burnout management, through open data policies and map design.

This year's edition was really good, as always for this conference. A lot of social events allowed direct conversations with passionate people from all over the world, and the presentations were diverse and of high quality.

Oslandia at FOSS4G

Oslandia is a full player in the open source GIS field, and we participated this year with several contributions.


First of all, we gave a workshop on our 3D stack, with new features and a now vertical stack of software, from storage to visualization.

This workshop features the use of this stack for managing 3D GIS data. Given an open data dataset from the city of Lyon, France, we integrate the data into PostGIS and use PostGIS 3D features to analyze and enrich it. Then Horao (http://oslandia.github.io/horao/), a 3D desktop viewer coupled with QGIS, is used to do some 3D visualization.


In a second part of the workshop, we use the very recently released Cuardo web client ( http://www.cuardo.org ) to display the data in a standard browser, using WebGL.



The workshop is self-contained, and you can follow it completely to discover these exciting new features in open source 3D GIS. Find it online: https://github.com/Oslandia/workshop-3d

Gimme some YeSQL!

Vincent Picavet also gave a presentation, "Gimme some YeSQL - and a GIS -", which focused on the latest and upcoming PostgreSQL features. From updatable views to the new JSONB data type, this is a practical and technical presentation of the big improvements coming in PostgreSQL 9.4. The talk illustrates the use of these features in a GIS context.

Slides are online, as well as the video (the title is wrong).



OpenSource software for surveying

Another presentation by Vincent gave an insight into a project we carried out for GIS Apavil in Romania: a full GIS for water distribution and wastewater networks. The presentation especially shows how we set up specific open source surveying tools, and how we deal with offline use, versioning, and synchronization of data.

Slides are on GitHub and the video should be up soon.

3D GIS

Olivier Courtin gave a summary of the current state of the 3D stack we develop and its planned future. Slides and video are online, and there should be more information coming soon.


Slides : https://github.com/Oslandia/presentations/raw/master/foss4g_2014/gisgoes3d_olivier_courtin.pdf



Get in touch !

Oslandia will continue its research and development on all of these topics. If you are interested in any of these subjects, do not hesitate to get in touch at infos@oslandia.com


A post to follow will give you some background information on what we heard at FOSS4G and where to hear more!

by Vincent Picavet at September 23, 2014 05:00 PM

September 22, 2014

Paul Ramsey

PostGIS Feature Frenzy

A specially extended feature frenzy for FOSS4G 2014 in Portland. Usually I only frenzy for 25 minutes at a time, but they gave me an hour long session!

PostGIS Feature Frenzy — Paul Ramsey from FOSS4G on Vimeo.

Thanks to the organizers for giving me the big room and big slot!

by Paul Ramsey (noreply@blogger.com) at September 22, 2014 11:30 PM

September 16, 2014

Paul Ramsey

PostGIS for Managers

At FOSS4G this year, I wanted to take a run at the decision process around open source, with particular reference to the decision to adopt PostGIS: what do managers need to know before they can get comfortable with the idea of making the move?

The Manager's Guide to PostGIS — Paul Ramsey from FOSS4G on Vimeo.

by Paul Ramsey (noreply@blogger.com) at September 16, 2014 07:48 PM

September 11, 2014

Postgres OnLine Journal (Leo Hsu, Regina Obe)

FOSS4G 2014 televised live

If you weren't able to make it to FOSS4G 2014 this year, you can still experience the event live. All the tracks are being televised live, and the reception is pretty good: https://2014.foss4g.org/live/. Lots of GIS users are using PostGIS and PostgreSQL. People seem to love Node.js too.

After hearing enough about Node.js from all these people, and this guy (Bill Dollins), I decided to try it out for myself.

I created a Node.js web application, which you can download from here: https://github.com/robe2/node_postgis_express . It's really a spin-off of my other viewers, but more raw. I borrowed the same ideas as Bill, but instead of the native node Postgres driver, I went for the pure JavaScript one, so it's easier to install on all platforms. I also experimented with base64 encoding to embed raster output directly in the page, so I don't have that silly img src path reference to contend with.
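
The base64 trick can be done on the database side. As a hedged sketch (the my_rasters table and rid column are hypothetical), PostGIS can render a tile to PNG and PostgreSQL can base64-encode it for use in a data URI:

```sql
-- Render a raster tile as PNG and base64-encode it; the client can
-- embed the result as src="data:image/png;base64,...". Note that
-- encode(..., 'base64') wraps lines, so strip the newlines first.
SELECT replace(encode(ST_AsPNG(rast), 'base64'), E'\n', '') AS png_b64
FROM my_rasters
WHERE rid = 1;
```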

by Leo Hsu and Regina Obe (nospam@example.com) at September 11, 2014 08:37 PM

September 10, 2014

A GeoSpatial World

Converting 3D formats (OBJ, PLY, STL, VTP) to PostGIS WKT POLYHEDRALSURFACE


3D (OBJ, PLY, STL, VTP) to WKT POLYHEDRALSURFACE


Introduction

The idea for the tool I am going to present came from reading a message on the PostGIS-User list, polyhedral surface import from common 3D formats. The author was looking for a tool to translate a few 3D formats (OBJ, PLY, STL…) to WKB/WKT for a POLYHEDRALSURFACE geometry type, and I thought: why not develop such a tool, since the 3D viewer I had already built would let me validate the conversion.
I developed this tool in C# 2013 with:
  • ActiViz .Net, a .NET wrapper for VTK
  • ScintillaNET, a powerful text editing control for Windows and a wrapper for the Scintilla component.
I then searched the internet for sample files for each 3D data format and included them in the Data directory of the tool's installation. Here are a few links where you will find files to download:
  • OBJ: OBJ Files
  • PLY: PLY Files
  • STL: STL Files Sharing

Installation


Click the pg3DImport_setup.exe link to download the installer, then run it.
Click Next at each step, then click Finish.

Usage


Double-click the icon created by the installer to launch the application.

Opening and converting a 3D file

Click the open icon to choose a 3D file to convert. Go to the Pg3DImport installation directory, then into the DATA subdirectory; select the type of 3D file you want to convert (OBJ Files, for example), select the file cesna.obj, and click Open.

You can then view the file in the viewer pane.

Click the convert icon to produce the POLYHEDRALSURFACE WKT output.



You can then copy the result into the editor pane to insert it into a PostGIS table (version 2.0 or later), or view the result in the Pg3DViewer application, which you have probably already installed.
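
As a hedged sketch of that insert step (the table name is illustrative, and a simple unit cube stands in for the WKT the tool would actually generate):

```sql
-- Store a POLYHEDRALSURFACE in a PostGIS 2.0+ table; the WKT here is a
-- unit cube standing in for the tool's generated output
CREATE TABLE solids (id serial PRIMARY KEY, geom geometry(PolyhedralSurfaceZ));
INSERT INTO solids (geom) VALUES (ST_GeomFromText(
  'POLYHEDRALSURFACE Z(
     ((0 0 0, 0 1 0, 1 1 0, 1 0 0, 0 0 0)),
     ((0 0 0, 0 0 1, 0 1 1, 0 1 0, 0 0 0)),
     ((0 0 0, 1 0 0, 1 0 1, 0 0 1, 0 0 0)),
     ((1 1 1, 1 0 1, 1 0 0, 1 1 0, 1 1 1)),
     ((1 1 1, 1 1 0, 0 1 0, 0 1 1, 1 1 1)),
     ((1 1 1, 0 1 1, 0 0 1, 1 0 1, 1 1 1)))'));
```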

Viewing in Pg3DViewer.

Another example

Open an STL file from the installation's DATA directory, and display it in Pg3DViewer.
You can open and convert the various files provided in the installation's DATA directory, but if you have your own 3D files, use those instead.

Conclusion

We have reached the end of this post: one more step into the new 3D world of PostGIS. Don't hesitate to send me your ideas for improvement.

by Jérôme ROLLAND (noreply@blogger.com) at September 10, 2014 09:38 AM


PostGIS Development

PostGIS 2.1.4 Released

The 2.1.4 release of PostGIS is now available.

The PostGIS development team is happy to release a patch for PostGIS 2.1: the 2.1.4 release. As befits a patch release, the focus is on bugs, breakages, and performance issues.

http://download.osgeo.org/postgis/source/postgis-2.1.4.tar.gz

Continue Reading by clicking title hyperlink ..

by Regina Obe at September 10, 2014 12:00 AM

September 09, 2014

Safe Software

Free and Open Source: The Inside Scoop on FME and FOSS

“Open source tools play well with others.” Paul Ramsey, PostGIS founder, talks Open Source at the FME International User Conference 2014.


As a software developer, coding something that’s already been designed, implemented, and optimized is a pretty gigantic waste of time.

Open source libraries help developers avoid re-inventing wheels. At this year’s FME UC, PostGIS founder Paul Ramsey took the stage to talk about the awesomeness that is Open Source. He explained how it’s important that developers use available knowledge—and FME devs understand this, adding value by building useful functionality on top of open source libraries.

Safe Software proudly supports the Free and Open Source Software (FOSS) community and regularly sponsors the North American FOSS4G conferences. FME uses many FOSS libraries, and when the devs here at Safe aren’t playing soccer or LEGO or a game of oversized Jenga, they’re working hard to make these libraries more accessible.

Our use of these libraries is a friendly two-way street. A two-way Yellow Brick Road, if you will. FOSS adds value to FME’s advanced data transformation and automation capabilities, while our contributions to these technologies have included funding, coding, testing, and bug fixes. Here are ten such libraries.

10 FOSS Libraries that FME Uses and Contributes to

This Mario-like map is made possible by the integration of Mapnik and FME.


1. Mapnik

Mapnik is a toolkit that renders raster and vector data into beautiful cartographic maps. FME’s Mapnik transformer lets users leverage Mapnik’s capabilities with any of the 325+ data formats FME supports. With Mapnik and FME, you can also create Web Map Tiling Services, make a map out of your face, map yer way to buried treasure arr matey!, and other such highly critical tasks. FME’s intuitive graphical interface makes it easy for any user to benefit from this complex library.

2. SpatiaLite

SpatiaLite extends SQLite into a simple Spatial DBMS. FME’s SpatiaLite Reader / Writer allows users to connect this format with hundreds of other formats, and leverage all of FME’s data transformation and automation capabilities. For example, you can instantly import SpatiaLite data into Google Maps Engine using our free Data Loader for Google Maps Engine.

FME reads and writes a variety of open source databases.


3. PostGIS

PostGIS is a spatial database that adds support for geographic objects and location queries to PostgreSQL. FME’s PostGIS Reader / Writer lets users integrate hundreds of data sources with this format. Watch Paul Ramsey’s FME UC talk to see the evolution of PostGIS, from the ideas that inspired it to its widespread use today.

4. JTS and GEOS

JTS and GEOS (Geometry Engine – Open Source) provide spatial predicates and functions for processing 2D geometry. JTS is the Java library and GEOS is the C++ library. Some of FME’s powerful geometry transformations are based on GEOS/JTS algorithms.

5. OGR and GDAL (Geospatial Data Abstraction Library)

These OSGeo libraries support reading/writing over 50 raster formats (GDAL) and 20 vector formats (OGR). FME uses these libraries to add several readers and writers to its list of supported formats, which means these formats gain a wide range of enterprise transformation and automation capabilities.

Bonus: if you use a GDAL-based format in FME, you can feel comfortable knowing it was probably implemented by our in-house world Tetris champion.

6. FDO

FDO works with a variety of geospatial data. This OSGeo API brings additional formats to FME, for example SQLite. In addition, Safe offers the free FME FDO Provider to allow AutoCAD Map 3D users to read 9 additional data formats. Safe was one of the earliest adopters of this API (version 3.0) and is acknowledged as an official Product Using FDO.

7. Ingres

Ingres is an RDBMS that supports large commercial and government applications. Safe’s partnership with Actian resulted in an open-source FME reader/writer for Ingres tables. This means increased data accessibility for Ingres users, making it easy to integrate spatial and enterprise data.

FME Workbench and the FME Data Inspector are available on Linux.


8. Qt

Qt is an application framework that we’ve used to help make FME fully cross-platform. Since FME 2013, FME has been available in beta on Mac and Linux platforms.

9. Interlis

Interlis is a geodata modeling and integration language adopted by the Swiss government as their national standard. Eisenhut Informatik is one of Safe’s longest standing open source partners, and their ili2fme plug-in allows Interlis data to be translated to/from any of the formats supported by FME.

10. LASzip

LASzip is a specialized library by Martin Isenburg that compresses LiDAR data without information loss—turning chubby LAS point clouds into lean, manageable LAZ files. The incorporation of this technology into FME has resulted in a great improvement in the ability to archive and exchange point cloud data.

But Wait, There’s More

For the full list of libraries licensed under the LGPL, visit our FOSS Code Center. You can also view more information about FME’s open source libraries by opening FME Desktop and selecting About > Legal Notices.

FME for Free

Data Loader for Google Maps Engine

The Data Loader for Google Maps Engine is one free tool we offer for converting data with FME technology.

FME is FOSS! Sort of. We do offer some free data translation tools.

In addition to the FME FDO Provider mentioned above, we also offer the Data Loader for Google Maps Engine, the CityGML Importer for InfraWorks, and the FME Revit Exporter Plug-in.

The Easy Translator is also available as a free web service, for immediately translating data into your required format and coordinate system.

Moving forward, we aim to make more free tools and web services like these.


How have you integrated your software or project with Free and Open Source Software? Are you attending FOSS4G this year, or have you in the past? Let us know in the comments!

The post Free and Open Source: The Inside Scoop on FME and FOSS appeared first on Safe Software Blog.

by Tiana Warner at September 09, 2014 06:19 PM

A GeoSpatial World

Pg3DViewer: new version

 


I have built a new version of the Pg3DViewer tool, which integrates ScintillaNET, a powerful text-editing control for Windows with syntax highlighting, and a wrapper for the Scintilla component. I used this control to handle PostgreSQL's SQL syntax and the PostGIS functions. The pgsql.xml file you will find in the installation directory contains everything needed to manage the keyword lists and the style assigned to each list of keywords; I leave you the pleasure of exploring it and modifying it to your liking.

The download link: Pg3DViewer new version

by Jérôme ROLLAND (noreply@blogger.com) at September 09, 2014 04:00 PM

September 02, 2014

Boston GIS (Regina Obe, Leo Hsu)

PostGIS sessions at Postgres Open 2014 Chicago not too late

If you haven't signed up for our Postgres Open 2014 PostGIS tutorials, there are still a few tickets available, but time and space are running out.

For those of you who've already signed up for our tutorials on September 17th, 2014, keep an eye on this page: http://www.postgis.us/pgopen2014. Not much to see there yet, but we'll be posting content there before the tutorials, as well as the examples, so you can flaunt your mastery of cutting and pasting.

by Regina Obe (nospam@example.com) at September 02, 2014 03:20 AM

Postgres OnLine Journal (Leo Hsu, Regina Obe)

PostGIS presentations at Postgres Open 2014 in Chicago

For those of you who will be attending either of our PostGIS sessions in Postgres Open 2014 in Chicago September 17th, we've put up a page where we'll be posting links to the data we'll be using as well as the examples so you can follow along and copy and paste. Page is http://www.postgis.us/pgopen2014.

We ask that you bookmark the page before you come and try to install PostGIS 2.1 if you haven't already. Not much to see there yet. For those who haven't signed up, it's not too late. For those who will be unable to make it, we'll try to post all the content from the tutorials on the above page.

by Leo Hsu and Regina Obe (nospam@example.com) at September 02, 2014 03:05 AM

August 20, 2014

shisaa.jp

Postgis and PostgreSQL in Action - Timezones

Preface

Recently, I was lucky to be part of an awesome project called the Breaking Boundaries Tour.

This project is about two brothers, Omar and Greg Colin, who take their Stella scooters on a full round trip across the United States. And, while they are at it, they try to raise funding for Surfer's Healing Folly Beach - an organization that does great work enhancing the lives of children with autism through surfing. To accommodate this trip, they wished to have a site where visitors could follow their trail live, as it happened. A marker would travel across the map, with them, 24/7.

Furthermore, they needed the ability to jump off their scooters, snap a few pictures, edit a video, write some side info and push it onto the net, for the whole world to see. Immediately after they made their post, it had to appear on the exact spot they were at when snapping their moments of beauty.

To aid in the live tracking of their global position, they acquired a dedicated GPS tracking device which sends a latitude/longitude coordinate via a mobile data network every 5 minutes.

Now, this (short) post is not about how I built the entire application, but rather about how I used PostGIS and PostgreSQL for a rather peculiar matter: deducing timezone information.

For those who are interested though: the site is entirely built in Python using the Flask "micro framework" and, of course, PostgreSQL as the database.

Timezone information?

Yes. Time, dates, timezones: hairy worms in hairy cans which many developers hate to open, but have to sooner or later.

In the case of Breaking Boundaries Tour, we had one major occasion where we needed the correct timezone information: where did the post happen?

Where did it happen?

A feature we wanted to implement was one to help visitors get a better view of when a certain post was written. Seeing when a post was written in your local timezone is much more convenient than seeing the post time in some foreign zone.

We are lazy and do not wish to count back- or forward to figure out when a post popped up in our frame of time.

The reasoning is simple: always calculate all the times involved back to simple UTC (GMT). Then figure out the client's timezone using JavaScript, apply the time difference, and done!

Simple eh?

Correct, except for one small detail in the feature request, in what zone was the post actually made?

Well...damn.

While your heart might be in the right place when thinking: "Simple, just look at the locale of the machine (laptop, mobile phone, ...) that was used to post!", this information is just too fragile. Remember, the brothers are crossing the USA, riding through at least three major timezones. You simply cannot expect all the devices involved when posting to always adjust their locale automatically depending on where they are.

We need a more robust solution. We need PostGIS.

But, how can a spatial database help us to figure out the timezone?

Well, thanks to the hard labor delivered to us by Eric Muller from efele.net, we have a complete and maintained shapefile of the entire world, containing polygons that represent the different timezones accompanied by the official timezone declarations.

This enables us to use the latitude and longitude information from the dedicated tracking device to pinpoint in which timezone they were while writing their post.

So let me take you on a short trip to show you how I used the above data in conjunction with PostGIS and PostgreSQL.

Getting the data

The first thing to do, obviously, is to download the shapefile data and load it into our PostgreSQL database. Navigate to the Timezone World portion of the efele.net site and download the "tz_world" shapefile.

This will give you a zip which you can extract:

$ unzip tz_world.zip

Unzipping will create a directory called "world" in which you can find the needed shapefile package files.

Next you will need to make sure that your database is PostGIS ready. Connect to your desired database (let us call it bar) as a superuser:

$ psql -U postgres bar

And create the PostGIS extension:

CREATE EXTENSION postgis;

Now go back to your terminal and load the shapefile into your database using the original owner of the database (here called foo):

$ shp2pgsql -S -s 4326 -I tz_world | psql -U foo bar

As you might remember from the PostGIS series, this loads in the geometry from the shapefile using only simple geometry (not "MULTI..." types) with a SRID of 4326.

What have we got?

This will take a couple of seconds and will create one table and two indexes. If you describe your database (assuming you have not made any tables yourself):

public | geography_columns | view     | postgres
public | geometry_columns  | view     | postgres
public | raster_columns    | view     | postgres
public | raster_overviews  | view     | postgres
public | spatial_ref_sys   | table    | postgres
public | tz_world          | table    | foo
public | tz_world_gid_seq  | sequence | foo

You will see the standard PostGIS bookkeeping and you will find the tz_world table together with a gid sequence.

Let us describe the table:

\d tz_world

And get:

Column |          Type          |                       Modifiers                        
--------+------------------------+--------------------------------------------------------
gid    | integer                | not null default nextval('tz_world_gid_seq'::regclass)
tzid   | character varying(30)  | 
geom   | geometry(Polygon,4326) | 
Indexes:
    "tz_world_pkey" PRIMARY KEY, btree (gid)
    "tz_world_geom_gist" gist (geom)

So we have:

  • gid: an arbitrary id column
  • tzid: holding the standards compliant textual timezone identification
  • geom: holding polygons in SRID 4326.

Also notice we have two indexes made for us:

  • tz_world_pkey: a simple B-tree index on our gid
  • tz_world_geom_gist: a GiST index on our geometry

This is a rather nice set, would you not say?

Using the data

So how do we go about using this data?

As I have said above, we need to figure out in which polygon (timezone) a certain point resides.

Let us take an arbitrary point on the earth:

  • latitude: 35.362852
  • longitude: 140.196131

This is a spot in the Chiba prefecture, central Japan.

Using the Simple Features functions we have available in PostGIS, it is trivial to find out in which polygon a certain point resides:

SELECT tzid 
    FROM tz_world
    WHERE ST_Intersects(ST_GeomFromText('POINT(140.196131 35.362852)', 4326), geom);

And we get back:

    tzid    
------------
 Asia/Tokyo

Awesome!

In the above query I used the function ST_Intersects, which checks if a given piece of geometry (our point) shares any space with another piece. If we check the execution plan of this query:

EXPLAIN ANALYZE SELECT tzid 
    FROM tz_world
    WHERE ST_Intersects(ST_GeomFromText('POINT(140.196131 35.362852)', 4326), geom);

We get back:

                                                      QUERY PLAN                                                          
------------------------------------------------------------------------------------------------------------------------------
Index Scan using tz_world_geom_gist on tz_world  (cost=0.28..8.54 rows=1 width=15) (actual time=0.591..0.592 rows=1 loops=1)
    Index Cond: ('0101000020E61000006BD784B446866140E3A430EF71AE4140'::geometry && geom)
    Filter: _st_intersects('0101000020E61000006BD784B446866140E3A430EF71AE4140'::geometry, geom)
Total runtime: 0.617 ms

That is not bad at all: a runtime of a little over 0.6 milliseconds, and it is using our GiST index.

But, if a lookup is using our GiST index, a small alarm bell should go off inside your head. Remember my last chapter on the PostGIS series? I kept on babbling about index usage and how geometry functions or operators can only use GiST indexes when they perform bounding box calculations.

The latter might pose a problem in our case, for bounding boxes are very rough approximations of the actual geometry. This means that when we arrive near timezone borders, our calculations might just give us the wrong timezone.

So how can we fix this?

This time, we do not need to.

This is one of the few blessed functions that both makes use of an index and is very accurate.

ST_Intersects first uses the index to perform bounding box calculations. This filters out the majority of available geometry. Then it performs a more expensive, but more accurate calculation (on a small subset) to check if the given point is really inside the returned matches.

We can thus simply use this function without any more magic...life is simple!
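The EXPLAIN output above shows exactly this two-stage behavior; written out by hand, the query is roughly equivalent to the following sketch:

```sql
-- Hand-expanded form of ST_Intersects: an index-backed bounding box
-- test (&&) followed by the exact, more expensive intersection test.
SELECT tzid
FROM tz_world
WHERE ST_GeomFromText('POINT(140.196131 35.362852)', 4326) && geom
  AND _ST_Intersects(ST_GeomFromText('POINT(140.196131 35.362852)', 4326), geom);
```

This is only to illustrate what the planner showed us; in practice you just call ST_Intersects and let it do both stages.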

Implementation

Now it is fair to say that we do not wish to perform this calculation every time a user views a post, that would not be very efficient nor smart.

Rather, it is a good idea to generate this information at post time, and save it for later use.

The way I have set up saving this information is twofold:

  • I only save a UTC (GMT) generalized timestamp of when the post was made.
  • I made an extra column in my so-called "posts" table where I only save the string that represents the timezone (Asia/Tokyo in the above case).

This keeps the date/time information in the database naive of any timezone and makes for easier calculations to give the time in either the client's timezone or in the timezone the post was originally written. You simply have one "root" time which you can move around timezones.
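As a sketch of that "root time" idea on the application side (the variable names here are illustrative, not the actual schema), the conversion in Python could look like this:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Assumption: the database stores a timezone-naive UTC timestamp plus
# the tzid string that the trigger saved for the post.
posted_at_utc = datetime(2014, 8, 20, 1, 0, 0)
tzid = "Asia/Tokyo"

# Attach UTC, then shift into the stored zone: one "root" time,
# rendered in whatever zone is needed.
local = posted_at_utc.replace(tzinfo=timezone.utc).astimezone(ZoneInfo(tzid))
print(local.isoformat())  # 2014-08-20T10:00:00+09:00
```

The same root timestamp can be shifted into the visitor's zone instead, which is exactly what the JavaScript side does.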

On every insert of a new post I have created a trigger that fetches the timezone and inserts it into the designated column. You could also fetch the timezone and update the post record using Python, but opting for an in-database solution saves you a few extra, unneeded round trips and is most likely a lot faster.

Let us see how we could create such a trigger.

A trigger in PostgreSQL is an event you can set to fire when certain conditions are met. The action(s) to perform when it fires have to be encapsulated inside a PostgreSQL function. Let us thus first start by creating the function that will insert our timezone string.

Creating functions

In PostgreSQL you can write functions in either C, Procedural languages (PgSQL, Perl, Python) or plain SQL.

Creating functions with plain SQL is the most straightforward and easiest way. However, since we want to write a function that is to be used inside a trigger, we have an even better option. We can employ the power of the embedded PostgreSQL procedural language to easily access and manipulate our newly inserted data.

First, let us see which query we would use to fetch the timezone and update our post record:

UPDATE posts
  SET tzid = timezone.tzid
  FROM (SELECT tzid
          FROM tz_world
          WHERE ST_Intersects(
            ST_SetSRID(
              ST_MakePoint(140.196131, 35.362852),
            4326),
          geom)) AS timezone
  WHERE pid = 1;

This query will fetch the timezone string using a subquery and then update the correct record (a post with "pid" 1 in this example).

How do we pour this into a function?

CREATE OR REPLACE FUNCTION set_timezone() RETURNS TRIGGER AS $$
BEGIN
    UPDATE posts
    SET tzid = timezone.tzid
    FROM (SELECT tzid 
            FROM tz_world
              WHERE ST_Intersects(
                ST_SetSRID(
                  ST_MakePoint(NEW.longitude, NEW.latitude),
                4326),
              geom)) AS timezone
    WHERE pid = NEW.pid;
    RETURN NEW;
END $$
LANGUAGE PLPGSQL IMMUTABLE;

First we use the syntax CREATE OR REPLACE FUNCTION to indicate we want to create (or replace) a custom function. Then we tell PostgreSQL that this function will return type TRIGGER.

You might notice that we do not give this function any arguments. The reasoning here is that this function is "special". Functions which are used as triggers automatically get information about the inserted data made available to them.

Inside the function you can see we access our latitude and longitude prefixed with NEW. These keywords, NEW and OLD, refer to the record after and before the trigger(s) happened. In our case either would do: since we do not alter the latitude or longitude data, we simply fill a column that is NULL by default. There are more keywords available (TG_NAME, TG_RELID, TG_NARGS, ...) which refer to properties of the trigger itself, but that is beyond today's scope.

The actual SQL statement is wrapped between double dollar signs ($$). This is called dollar quoting and is the preferred way to quote your SQL string (as opposed to using single quotes). The body of the function, which in our case is mostly the SQL statement, is surrounded with a BEGIN and END keyword.

A trigger function always needs a RETURN statement that is used to provide the data for the updated record. This too has to reside in the body of the function.

Near the end of our function we need to declare in which language this function was written, in our case PLPGSQL.

Finally, the IMMUTABLE keyword tells PostgreSQL that this function is rather "functional", meaning: if the inputs are the same, the output will also, always be the same. Using this caching keyword gives our famous PostgreSQL planner the ability to make decisions based on this knowledge.

Creating triggers

Now that we have this functionality wrapped into a tiny PLPGSQL function, we can go ahead and create the trigger.

First, there are the events on which a trigger can execute:

  • INSERT
  • UPDATE
  • DELETE
  • TRUNCATE

Next, for each event you can specify at what timing your trigger has to fire:

  • BEFORE
  • AFTER
  • INSTEAD OF

The last one is a special timing by which you can replace the default behavior of the mentioned events.

For our use case, we are interested in executing our function AFTER INSERT.

CREATE TRIGGER set_timezone
    AFTER INSERT ON posts
    FOR EACH ROW
    EXECUTE PROCEDURE set_timezone();

This will set up the trigger that fires after the insert of a new record.
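To sanity-check the trigger, you could insert a post with the coordinates used earlier and read back the column it fills. The exact `posts` columns below are assumptions inferred from the function body, not the post's actual schema:

```sql
-- Hypothetical smoke test: the trigger should fill tzid after the insert.
INSERT INTO posts (pid, latitude, longitude)
VALUES (42, 35.362852, 140.196131);

SELECT tzid FROM posts WHERE pid = 42;
-- expected: Asia/Tokyo
```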

Wrapping it up

Good, that's all there is to it.

We use a query, wrapped in a function, triggered by an insert event, to inject the official timezone string deduced by PostGIS's spatial abilities.

Now you can use this information to get the exact timezone of where the post was made and use this to present the surfing client both the post timezone time and their local time.

For the curious ones out there: I used the MomentJS library for the client side time parsing. This library offers a timezone extension which accepts these official timezone strings to calculate offsets. A lifesaver, so go check it out.

Also, be sure to follow the bros while they scooter across the States!

And as always...thanks for reading!

by Tim van der Linden at August 20, 2014 10:00 AM

August 08, 2014

Stephen Mather

KNN with FLANN and laspy, a starting place

FLANN is Fast Library for Approximate Nearest Neighbors, which is a purportedly wicked fast nearest neighbor library for comparing multi-dimensional points. I only say purportedly, as I haven’t verified, but I assume this to be quite true. I’d like to move some (all) of my KNN calculations outside the database.

I’d like to do the following with FLANN: take a LiDAR point cloud and change it into a LiDAR height-above-ground point cloud. What follows are my explorations so far.

In a previous series of posts, e.g. http://smathermather.wordpress.com/2014/07/14/lidar-and-pointcloud-extension-pt-6/

pointcloud6

I have been using the point cloud extension in PostGIS. I like the 2-D chipping, but I think I should segregate my data into height classes before sending it into the database. In this way, I can query my data by height class and by location efficiently, taking full advantage of the efficiencies of storing all those little points in chips, while also being able to query the data in any of the dimensions I need to in the future. Enter FLANN.

I haven’t gotten far. To use FLANN with LiDAR through Python, I’m also using laspy.  There’s a great tutorial here: http://laspy.readthedocs.org/en/latest/tut_part_1.html

laspy

I make one change to the tutorial section using FLANN. The code as written is:

import laspy
import pyflann as pf
import numpy as np

# Open a file in read mode:
inFile = laspy.file.File("./laspytest/data/simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.

neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

To make this example work with the current version of pyflann, we need to make sure we import all of pyflann (or at least nn), and also set flann = FLANN() as follows:

import laspy
import numpy as np
from pyflann import *

# Open a file in read mode:
inFile = laspy.file.File("simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.
flann = FLANN()
neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

Finally, a small note on installation of pyflann on Ubuntu. What I’m about to document is undoubtedly not the recommended way to get pyflann working. But it worked… .

Installation for FLANN on Ubuntu can be found here: http://www.pointclouds.org/downloads/linux.html
pcl

But this does not seem to install pyflann. That said, it installs all our dependencies + FLANN, so…

I cloned, compiled, and installed the FLANN repo: https://github.com/mariusmuja/flann

git clone git://github.com/mariusmuja/flann.git
cd flann
mkdir BUILD
cd BUILD
cmake ../.
make
sudo make install

This gets pyflann where it needs to go, and voila! We can now do nearest neighbor searches within Python.

Next step: turn my LiDAR xyz point cloud into an xy-height point cloud, then load it, height class by height class, into PostgreSQL. Wish me luck!
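As a sketch of that height-above-ground step: for each point, average the elevation of the K nearest ground-classified points in XY and subtract. The data below is made up and the search is brute force for clarity; at scale, FLANN's `flann.nn()` would replace the neighbor search:

```python
# Sketch: height above ground via K nearest ground points in XY.
# Points are (x, y, z) tuples; "ground" would come from the LAS
# classification codes in a real point cloud.
points = [(0.0, 0.0, 12.0), (1.0, 1.0, 13.5), (5.0, 5.0, 20.0)]
ground = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.2), (5.0, 4.0, 11.0)]

def height_above_ground(p, ground, k=2):
    # Distances in XY only -- we want the ground *under* the point.
    nearest = sorted(ground, key=lambda g: (g[0] - p[0])**2 + (g[1] - p[1])**2)
    local_ground = sum(g[2] for g in nearest[:k]) / k
    return p[2] - local_ground

heights = [height_above_ground(p, ground) for p in points]
print(heights)
```

The resulting heights, bucketed into classes, are what would then be loaded into the database.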


by smathermather at August 08, 2014 02:15 PM

August 07, 2014

Stephen Mather

Drivetime analyses, pgRouting

Map of overlapping 5-minute drive times from park access points
We’ve got some quick and dirty pgRouting-based code up on github. I say quick and dirty because it directly references the table names in both of the functions. I hope to fix this in the future.

The objective with this code is to input a point, use a nearest neighbor search to find the nearest intersection, and from that calculate a drive time alpha shape.

First, the nearest neighbor search:

CREATE OR REPLACE FUNCTION pgr.nn_search (geom geometry) RETURNS
int8 AS $$

SELECT id FROM pgr.roads_500_vertices_pgr AS r
ORDER BY geom <#> r.the_geom
LIMIT 1;

$$ LANGUAGE SQL VOLATILE;

This function takes one argument: pass it a point geometry, and it will do a KNN search for the nearest point. Since we are leveraging pgRouting, it simply returns the id of the nearest intersection point. We wrote this as a function in order to run it, e.g. on all the points in a table. As I stated earlier, it directly references a table name, which is a little hackish, but we'll patch that faux pas later.
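Usage might look like the following; the coordinates and SRID are placeholders, and the point should be in the same SRID as the road network:

```sql
-- Find the id of the road-network vertex nearest a given point
SELECT pgr.nn_search(ST_SetSRID(ST_MakePoint(-81.57, 41.39), 4326));
```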

Now that we have the ID of the point in question, we can do a driving distance calculation, wrap that up in an alpha shape, and return our polygon. Again, we write this as a function:

CREATE OR REPLACE FUNCTION pgr.alpha_shape (id integer, minutes integer) RETURNS
geometry AS $$

WITH alphashape AS(SELECT pgr_alphaShape('WITH
DD AS (
SELECT seq, id1 AS node, cost
FROM pgr_drivingDistance(''SELECT id, source, target, cost FROM pgr.roads_500'',' || id || ', ' || minutes || ', false, false)),
dd_points AS (
SELECT id_ AS id, x, y
FROM pgr.roads_500_vertices_pgr v, DD d
WHERE v.id = d.node)
SELECT * FROM dd_points')),

alphapoints AS (
SELECT ST_Makepoint((pgr_alphashape).x, (pgr_alphashape).y) FROM alphashape),

alphaline AS (
SELECT ST_MakeLine(ST_MakePoint) FROM alphapoints)

SELECT ST_MakePolygon(ST_AddPoint(ST_MakeLine, ST_StartPoint(ST_MakeLine))) AS the_geom FROM alphaline

$$ LANGUAGE SQL VOLATILE;

Finally, we’ll use these functions in conjunction with a set of park feature access points to map out our 5-minute drive times. Overlaps in 5-minute zones are rendered as a brighter green in the image above.

CREATE TABLE pgr.alpha_test AS
WITH dest_ids AS (
SELECT pgr.nn_search(the_geom) AS id FROM pgr.dest_pts
)
SELECT pgr.alpha_shape(id::int, 5)::geometry, id FROM dest_ids;

Oh, and I should point out that for the driving distance calculation, we have pre-calculated the costs of the roads based on the speed limit, so cost is really travel time in minutes.
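That pre-calculation might look something like this sketch; the column names `length_m` and `speed_kmh` are hypothetical, and all pgRouting needs is for `cost` to end up in minutes:

```sql
-- Travel time in minutes = (length in km) / (speed in km/h) * 60
UPDATE pgr.roads_500
SET cost = (length_m / 1000.0) / speed_kmh * 60;
```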


by smathermather at August 07, 2014 01:50 AM

August 01, 2014

Stephen Mather

Plugin-free QGIS TMS tiles via GDAL

Want to load your favorite tiles into QGIS? How about plugin-free TMS tiles in QGIS via GDAL:

http://www.3liz.com/blog/rldhont/index.php?post/2012/07/17/OpenStreetMap-Tiles-in-QGIS

Really awesome.

Needs but one change: epsg:900913 should be epsg:3857 or QGIS (GDAL?) throws an error. Presumably you could also define epsg:900913 in some config file, but barring that, create an XML file as follows and load it as a raster in QGIS:

<GDAL_WMS>
    <Service name="TMS">
        <ServerUrl>http://maps1.clemetparks.com/tilestache/tilestache.cgi/basemap/${z}/${x}/${y}.jpg</ServerUrl>
    </Service>
    <DataWindow>
        <UpperLeftX>-20037508.34</UpperLeftX>
        <UpperLeftY>20037508.34</UpperLeftY>
        <LowerRightX>20037508.34</LowerRightX>
        <LowerRightY>-20037508.34</LowerRightY>
        <TileLevel>18</TileLevel>
        <TileCountX>1</TileCountX>
        <TileCountY>1</TileCountY>
        <YOrigin>top</YOrigin>
    </DataWindow>
    <Projection>EPSG:3857</Projection>
    <BlockSizeX>256</BlockSizeX>
    <BlockSizeY>256</BlockSizeY>
    <BandsCount>3</BandsCount>
    <Cache />
</GDAL_WMS>

Now I can use my pretty tilestache maps _everywhere!_:
Image of coyote points overlayed on tilestache rendered hillshade map

Screenshot of map from postgis

edit: forgot the hat tip for the lead on this, Thomas Gratier: @ThomasG77


by smathermather at August 01, 2014 03:28 AM

Stephen Mather

Cleaning animal tracking data — throwing away extra points

Such is the problem of the modern era: too much data, uneven data, and yet, should we keep it all?

Here’s the problem space: attach GPS collar to a coyote, send that data home, and you have a recipe for gleaning a lot of information about the movement of that animal across the landscape. In order to maximize the data collected while also maximizing the battery life of said device, sometimes (according to a preset scheme) you might sample every hour or every 15 minutes, and then switch back to once a day.

When you get the data back in, you get something like this:

Map of raw coyote data-- all points, showing all available collar GPS points

Already, we see some pretty interesting patterns. The first thing that I notice is clusters of activity. Are these clusters related to our uneven sampling, sometimes every 15 minutes, sometimes once or twice a day? Or are those really important clusters in the overall activity of the animal?

The next thing I notice is how perfectly the range of movement of the coyote is bounded by expressways to the west and north, and by the river to the east. Those seem to be pretty impermeable boundaries.

So, as does a man who only has a hammer, we clean the data with PostGIS. In this case, it’s an ideal task. First, metacode:

Task one: union / dissolve our points by day — for some days this will give us a single point, for other days a multi-point for the day.

Task two: find the centroid of our union. This will grab a centroid for each clump of points (a weighted spatial mean, if you will) and return that single point per day. There are some assumptions here that might not bear out under scrutiny, but it’s not a bad first approximation.

Now, code:

CREATE TABLE centroids AS
-- union / dissolve our points by day
WITH clusterp AS (
	SELECT ST_Union(geom) AS geom, gmt_date FROM nc_clipped_data_3734
		GROUP BY gmt_date
),
-- find the centroid of our union
centroids AS (
	SELECT ST_Centroid(geom) AS geom, gmt_date FROM clusterp
)
SELECT * FROM centroids;

Boom! One record per day:

Map showing much smaller home range, one record per day clustered around a single core zone

A different pattern now emerges. We can’t see the hard boundaries of the use area, but now the core portion of the home range can be seen.

Upcoming (guest) blog post on the use of these data in home range calculations in R. Stay tuned.

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under CC BY SA.


by smathermather at August 01, 2014 02:58 AM

July 28, 2014

A GeoSpatial World

PostGIS 3D Viewer, continued

 


Introduction

As I announced in my previous post, I have extended pg3DViewer so that it can handle POINT geometries, and for my tests I took an interest in point clouds.


For the rest of this tutorial we will use two data files; here are the download links:

Installation


Click the pg3DViewer_setup.exe link to download the installer for the new version. If you installed the previous version, uninstall it first, then run the installer.

Click the Next button at each step, then the Finish button.

The previous post shows all the installation screens: http://ageoguy.blogspot.fr/2014/07/postgis-3d-viewer.html

 

Usage

Before you start using pg3DViewer you must have a PostgreSQL database with the PostGIS spatial extension, version 2.0 or higher.

Double-click the icon the installer created on the desktop to launch the application.


Connect to your 3D database

 


Start by clicking this icon to connect to a PostgreSQL/PostGIS database, version 2.0 or higher, containing 3D data or ready to receive it.

 
  • Choose localhost as the server if your database is local to your machine, or enter your server's IP address
  • the default port is 5432; change it if necessary
  • enter the user name
  • enter the password
  • choose a database from the combobox
  • click the OK button to confirm your entries.


 

Processing an ASCII XYZ file

We will use the bunny.xyz file that you downloaded in the introduction.

We will create the table that will receive the data. In PgAdmin III, open the query tool and copy the following query:

-- Table: bunny

-- DROP TABLE bunny;

CREATE TABLE bunny
(
  x double precision,
  y double precision,
  z double precision,
  id bigserial NOT NULL,
  CONSTRAINT bunny_pkey PRIMARY KEY (id)
)
WITH (
  OIDS=FALSE
);
ALTER TABLE bunny
  OWNER TO postgres;

Then execute it.

Next we will insert the data into the table. Copy the following query into the query editor:

copy bunny(x,y,z)
from 'C:/temp/bunny.txt'
with delimiter ' '

Then execute it.

Now we can visualize the data. In the Query panel, copy the following query:

SELECT x,y,z
FROM bunny AS points_xyz

Click the 'Execute query' icon.

And here is the result after zoom and rotation.
As you may have noticed, the display time is very fast for a table containing 34,835 points.
We will now try a new query. In the Query panel, copy the following query:

SELECT st_geomfromtext('POINT Z('||x||' '||y||' '||z||')',0) as geom,'255:0:0'::text AS rgb FROM bunny

Click the 'Execute query' icon.

image
And here is the result after a zoom and a rotation.
The display takes much longer here, because this query builds the geometry on the fly.

The difference from the previous query is that that one produced a point file in XYZ format (thanks to PostgreSQL's COPY TO command), which is then displayed by a 'reader' from the VTK library. That path is triggered simply by adding 'AS points_xyz' at the end of the query and selecting the x, y and z columns.
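The file-based path can be sketched in plain SQL (this is an assumption about what the viewer does internally; the export path is illustrative):

```sql
-- Sketch of the assumed file-based path: export the selected columns as a
-- space-separated XYZ file that a VTK point reader can then load.
COPY (SELECT x, y, z FROM bunny)
TO 'C:/temp/bunny_export.xyz'
WITH DELIMITER ' '
```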

image

Processing a LAS file

We will use the file ‘2013 LiDAR 5105_54480.las’ that you downloaded in the introduction. We need some tools to import this file into our database; we will use a collection of tools called LAStools, which you can download from http://www.cs.unc.edu/~isenburg/lastools/ (direct download link: http://lastools.org/download/lastools.zip).

The tool we will use is called las2txt; here is the header of its associated README:

****************************************************************
las2txt:
Converts from binary LAS/LAZ 1.0 - 1.4 to an ASCII text format
For updates check the website or join the LAStools mailing list.

http://rapidlasso.com/
http://lastools.org/
http://groups.google.com/group/lastools/
http://twitter.com/lastools/
http://facebook.com/lastools/
http://linkedin.com/groups?gid=4408378

Martin @lastools
****************************************************************

And the link to read it in full: http://www.cs.unc.edu/~isenburg/lastools/download/las2txt_README.txt

Open a DOS window, go to the directory where you installed LAStools, then into the bin subdirectory. Run the following command:
las2txt -i "C:\temp\2013 LiDAR 5105_54480.las" -parse xyzc -sep semicolon

This command generates an ASCII (.txt) file with x, y, z and classification columns, separated by the ‘;’ (semicolon) character.

image
We now have a file that we can insert into our database. We start by creating the table that will receive the data; in the pgAdmin III query window, copy the following query:

-- Table: lidar_test

-- DROP TABLE lidar_test;

CREATE TABLE lidar_test
(
  x double precision,
  y double precision,
  z double precision,
  classification integer
)
WITH (
  OIDS=FALSE
);
ALTER TABLE lidar_test
  OWNER TO postgres;

then run it.
image

Next we insert the data into the table. Copy the following query into the query editor:

copy lidar_test(x,y,z,classification)
from 'C:/temp/2013 LiDAR 5105_54480.txt'
with delimiter ';'

then run it.

image
Now we can visualize the data. In the Query panel, copy the following query:

select x,y,z,
CASE
WHEN classification=2 then '128:0:0'
WHEN classification=3 then '0:64:0'
WHEN classification=4 then '0:128:0'
WHEN classification=5 then '0:255:0'
WHEN classification=6 then '255:128:64'
WHEN classification=11 then '128:128:128'
END as rgb
from lidar_test as points_xyzrgb

Click the ‘Execute query’ icon

image
And here is the result:

image
The classification values used in the query come from this query:
select distinct classification from lidar_test order by 1
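To see how many points fall into each class (the same numbers lasinfo reports in its classification histogram below), a simple aggregate works:

```sql
-- Point counts per ASPRS classification code
SELECT classification, count(*) AS nb_points
FROM lidar_test
GROUP BY classification
ORDER BY classification;
```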

You can also get information about the LAS file with the lasinfo tool; in a DOS window, type the following command (the quotes are needed because the path contains spaces):

lasinfo "c:\temp\2013 LiDAR 5105_54480.las"

At the end of the report you get the classification definitions.

lasinfo for c:\temp\2013 LiDAR 5105_54480.las
reporting all LAS header entries:
  file signature:             'LASF'
  file source ID:             0
  global_encoding:            1
  project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
  version major.minor:        1.2
  system identifier:          ''
  generating software:        'TerraScan'
  file creation day/year:     156/2013
  header size:                227
  offset to point data:       447
  number var. length records: 2
  point data format:          1
  point data record length:   28
  number of point records:    7938187
  number of points by return: 6742290 914660 231891 44542 4804
  scale factor x y z:         0.001 0.001 0.001
  offset x y z:               510499.94 5447999.8799999999 0
  min x y z:                  510499.940 5447999.880 34.110
  max x y z:                  511000.110 5448500.100 275.690
variable length header record 1 of 2:
  reserved             43707
  user ID              'LASF_Projection'
  record ID            34735
  length after header  88
  description          'Georeferencing Information'
    GeoKeyDirectoryTag version 1.1.0 number of keys 10
      key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
      key 2048 tiff_tag_location 0 count 1 value_offset 4269 - GeographicTypeGeoKey: GCS_NAD83
      key 2054 tiff_tag_location 0 count 1 value_offset 9102 - GeogAngularUnitsGeoKey: Angular_Degree
      key 2056 tiff_tag_location 0 count 1 value_offset 7019 - GeogEllipsoidGeoKey: Ellipse_GRS_1980
      key 2057 tiff_tag_location 34736 count 1 value_offset 0 - GeogSemiMajorAxisGeoKey: 6378137
      key 2058 tiff_tag_location 34736 count 1 value_offset 1 - GeogSemiMinorAxisGeoKey: 6356752.314
      key 2059 tiff_tag_location 34736 count 1 value_offset 2 - GeogInvFlatteningGeoKey: 298.2572221
      key 3072 tiff_tag_location 0 count 1 value_offset 26910 - ProjectedCSTypeGeoKey: UTM 10 northern hemisphere
      key 3076 tiff_tag_location 0 count 1 value_offset 9001 - ProjLinearUnitsGeoKey: Linear_Meter
      key 4099 tiff_tag_location 0 count 1 value_offset 9001 - VerticalUnitsGeoKey: Linear_Meter
variable length header record 2 of 2:
  reserved             43707
  user ID              'LASF_Projection'
  record ID            34736
  length after header  24
  description          'Double Param Array'
    GeoDoubleParamsTag (number of doubles 3)
      6.37814e+006 6.35675e+006 298.257
reporting minimum and maximum for all LAS point record entries ...
  X                   0     500170
  Y                   0     500220
  Z               34110     275690
  intensity           0        255
  return_number       1          5
  number_of_returns   1          5
  edge_of_flight_line 0          0
  scan_direction_flag 0          1
  classification      2         11
  scan_angle_rank   -19         22
  user_data          79         79
  point_source_ID 10217      10238
  gps_time 49786940.848411 49796462.877692
WARNING: there is coordinate resolution fluff (x10) in XYZ
overview over number of returns of given pulse: 5825940 1363444 563503 160908 24392 0 0
histogram of classification of points:
         2030423  ground (2)
         1836807  low vegetation (3)
          234836  medium vegetation (4)
         1811842  high vegetation (5)
         2022021  building (6)
            2258  road surface (11)

 

Conclusion

We have reached the end of this post. I recommend Paul Ramsey's excellent tutorial on LiDAR data, which you will find at http://workshops.boundlessgeo.com/tutorial-lidar/ ; all that remains is to display your data using the functions presented in that LiDAR tutorial.

by Jérôme ROLLAND (noreply@blogger.com) at July 28, 2014 10:08 AM

July 25, 2014

A GeoSpatial World

pgAdmin III plugin: PostGISViewer, continued

pgAdmin III plugin: PostGISViewer, continued

New version

PostGISViewer

Multi-geometries

I have added support for multi-geometries:
  • MULTIPOLYGON
  • MULTILINESTRING
  • MULTIPOINT

Queries

I have also added the ability to execute the queries present in the pgAdmin III SQL editor; for this to work, a few rules must be followed:
  • Only one pgAdmin III SQL editor may be open.
  • For the query or queries to be executed, no table name may be selected in the pgAdmin III object browser, unless that table is already loaded in the viewer. A selected table always takes precedence over queries.
  • The first column of each query, used for display, must be a geometry column or a geometry produced by a geometry-processing function (ST_Buffer, ST_Intersection, ...). This column or geometry must be wrapped in the AsBinary function and given the alias geom:
    • SELECT AsBinary(wkb_geometry) as geom ....
    • SELECT AsBinary(ST_Buffer(wkb_geometry, 1.0)) as geom...
    • SELECT AsBinary(ST_Intersection(a.wkb_geometry, b.wkb_geometry)) as geom...
    • ....
  • The second column must make it possible to determine the geometry type, so it must be wrapped in the GeometryType function; no alias is needed:
    • SELECT ..., GeometryType(wkb_geometry)...
    • SELECT ..., GeometryType(ST_Buffer(wkb_geometry, 1.0))...
    • SELECT ..., GeometryType(ST_Intersection(a.wkb_geometry, b.wkb_geometry))...
    • ....
  • Each query must end with a semicolon, which makes it possible to run several queries in a row.

Example

Below, two queries ending with semicolons:
  • The first query loads the commune named 'Sainte-Croix-de-Quintillargues'
  • The second query loads all the buildings of that commune.
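Following the rules above, the two queries could look like this (table and column names are illustrative; adapt them to your own schema):

```sql
-- Query 1: load the commune (illustrative table/column names)
SELECT AsBinary(wkb_geometry) AS geom, GeometryType(wkb_geometry)
FROM commune
WHERE nom = 'Sainte-Croix-de-Quintillargues';

-- Query 2: load all buildings inside that commune
SELECT AsBinary(b.wkb_geometry) AS geom, GeometryType(b.wkb_geometry)
FROM batiment b, commune c
WHERE c.nom = 'Sainte-Croix-de-Quintillargues'
  AND ST_Intersects(b.wkb_geometry, c.wkb_geometry);
```

Note that both queries end with a semicolon, as required for them to run in sequence.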
 
After launching the PostGISViewer plugin, the two queries are executed and give the following result:
The created layers are named "query" plus an identifier that increments for the duration of the viewer session.

Any suggestions or remarks for improving this tool are welcome.

To be continued...

by Jérôme ROLLAND (noreply@blogger.com) at July 25, 2014 02:48 PM

July 20, 2014

A GeoSpatial World

PostGIS 3D Viewer

 

A 3D viewer for PostGIS

Introduction

With version 2.0, PostGIS entered the world of 3D: it is now possible to store 3D geometries, with the addition of a Z coordinate to all the existing geometry types, and to store volumetric geometries, with the introduction of two new types:

  • TIN (Triangulated Irregular Network, a collection of TRIANGLEs)
  • POLYHEDRALSURFACE (a collection of POLYGONs)

The idea of a viewer for PostGIS 3D geometries had been in my head for a year, since an exchange with Claus Nagel and Felix Kunde of virtualcitySYSTEMS about the CityGML import tool I had written (CityGML to PostGIS 2.0); they had advised me then to move toward developing a 3D viewer. When I recently got back in touch to tell them about my work in progress, they welcomed the news with enthusiasm and received beta versions, which greatly helped me finalize my choices for the application.

Today 3D is more and more present in the GIS world with the CityGML format, and in the architecture world with the IFC format; importing these formats into a PostgreSQL/PostGIS database is a natural step, since it brings the full processing power of a spatial database to the management of such data.

For the CityGML format, virtualcitySYSTEMS provides a free, open-source database schema for storing and managing 3D city models. This schema mirrors the CityGML data model. Data can be imported with an import/export tool.
The link to their site: http://virtualcitysystems.de/en/solutions.html#3dcitydb
image

At this stage of development I only display geometries of type POLYGON, MULTIPOLYGON, TIN and POLYHEDRALSURFACE, which covers all volumetric geometries built from them. I will soon work on the other geometry types (POINT, LINESTRING, ...), which will be the subject of a new version of the application in the coming weeks.

I developed this tool in C# 2013 with:

  • The GDAL/OGR library
GDAL - Geospatial Data Abstraction Library
  • ActiViz .Net, a .NET wrapper for VTK
ActiViz .Net
VTK
  • PostgreSQL 9.2
PostgreSQL Windows
  • PostGIS 2.2.0 dev
Postgis
http://winnie.postgis.net/download/windows/pg92/buildbot/postgis-pg92-binaries-2.2.0devw32.zip
http://winnie.postgis.net/download/windows/pg92/buildbot/postgis-pg92-binaries-2.2.0devw64.zip

 

For loading 3D data I tested the 3DCityDB Importer/Exporter tool, http://www.3dcitydb.org/3dcitydb/downloads/ , which imports CityGML files into a PostgreSQL database with PostGIS (version 2.0 or later).

Here is an extract from the 3DCityDB-v2_0_6-postgis-Tutorial.pdf document covering the PostGIS part (in English):

image

 

Installation

image

Click the pg3DViewer_setup.exe link to download the installer, then run it.

Click the Next button at each step, then the Finish button.

pg3DViewer_install_01

pg3DViewer_install_02

pg3DViewer_install_03

pg3DViewer_install_04

pg3DViewer_install_05

pg3DViewer_install_06

 

Usage

Before you start using pg3DViewer you need a PostgreSQL database with the PostGIS spatial extension, version 2.0 or later. This database should contain 3D data; ideally you will have created a 3DCityDB database and imported one or more CityGML files. Here is a link where you can download CityGML data: Rotterdam 3D. Ask our friend Google to translate the page if it helps; you will find a link on the site to a PDF map showing the names of all the downloadable districts, http://www.rotterdam.nl/GW/Images/Gemeentewerken%20Home/Landmeten/overzicht%20buurten.pdf , then just pick your districts from the list below the link.

You can still run the first tests without having downloaded any data.

image Double-click the icon the installer created on the desktop to launch the application.

image

image

Connect to your 3D database

 

image

Start by clicking this icon to connect to a PostgreSQL/PostGIS (version 2.0 or later) database that contains 3D data or is ready to receive some.

  • Choose localhost as the server if your database runs locally on your machine, or enter your server's IP address
  • the port defaults to 5432; change it if necessary
  • enter the user name
  • enter the password
  • choose a database in the combo box
  • click the OK button to confirm your entries.

image


Visualizing a POLYHEDRALSURFACE geometry

In the Query panel, copy the following query:

SELECT st_force_collection(
    st_geomFromText
    (
      
'POLYHEDRALSURFACE(
        ((0 0 0, 0 1 0, 1 1 0, 1 0 0, 0 0 0)),
        ((0 0 0, 0 0 1, 0 1 1, 0 1 0, 0 0 0)),
        ((0 0 0, 1 0 0, 1 0 1, 0 0 1, 0 0 0)),
        ((0 0 1, 1 0 1, 1 1 1, 0 1 1, 0 0 1)),
        ((1 0 0, 1 1 0, 1 1 1, 1 0 1, 1 0 0)),
        ((1 1 0, 0 1 0, 0 1 1, 1 1 1, 1 1 0))
        )'
    )
)

image

Click the ‘Execute query’ icon

image
You get this result.

Mouse interactions with the render window:
- Left button: rotation
- Wheel: zoom
- Right button: dynamic zoom
- Left button + Shift key: pan
Try the various combinations of the Shift, Ctrl and Alt keys with the mouse buttons.

Keyboard interactions with the render window:
- w key: switch to wireframe display mode.
- s key: switch to surface display mode.
image
The render window after a rotation. image


Before going further, a short explanation of the query used above:

SELECT st_force_collection(
    st_geomFromText
    (
       
'POLYHEDRALSURFACE(
        ((0 0 0, 0 1 0, 1 1 0, 1 0 0, 0 0 0)),
        ((0 0 0, 0 0 1, 0 1 1, 0 1 0, 0 0 0)),
        ((0 0 0, 1 0 0, 1 0 1, 0 0 1, 0 0 0)),
        ((0 0 1, 1 0 1, 1 1 1, 0 1 1, 0 0 1)),
        ((1 0 0, 1 1 0, 1 1 1, 1 0 1, 1 0 0)),
        ((1 1 0, 0 1 0, 0 1 1, 1 1 1, 1 1 0))
        )'

    )
)

As you may have noticed, the call to ST_GeomFromText is wrapped in the ST_Force_Collection function. Why add this function? If you run this query in pgAdmin III, wrapping the whole thing in ST_AsText, you get this result:
"GEOMETRYCOLLECTION Z (
POLYGON Z ((0 0 0,0 1 0,1 1 0,1 0 0,0 0 0)),
POLYGON Z ((0 0 0,0 0 1,0 1 1,0 1 0,0 0 0)),
POLYGON Z ((0 0 0,1 0 0,1 0 1,0 0 1,0 0 0)),
POLYGON Z ((0 0 1,1 0 1,1 1 1,0 1 1,0 0 1)),
POLYGON Z ((1 0 0,1 1 0,1 1 1,1 0 1,1 0 0)),
POLYGON Z ((1 1 0,0 1 0,0 1 1,1 1 1,1 1 0)))
"
We went from a POLYHEDRALSURFACE geometry to a GEOMETRYCOLLECTION Z containing POLYGON Z geometries; but remember, a POLYHEDRALSURFACE is itself a collection of POLYGONs, so far so good.
Back to the initial question: why use ST_Force_Collection? Simply because the GDAL library I use in my development does not yet recognize geometries of type POLYHEDRALSURFACE. This trick works around that limitation.
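You can check the effect directly in pgAdmin III: only the wrapped version reports a type GDAL can consume. A quick sanity check on a one-face surface:

```sql
-- Compare the reported type with and without the collection wrapper
SELECT ST_GeometryType(g) AS raw_type,
       ST_GeometryType(ST_Force_Collection(g)) AS wrapped_type
FROM (SELECT ST_GeomFromText(
        'POLYHEDRALSURFACE(((0 0 0,0 1 0,1 1 0,1 0 0,0 0 0)))') AS g) AS t;
-- raw_type: ST_PolyhedralSurface, wrapped_type: ST_GeometryCollection
```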

Visualizing a TIN geometry

In the Query panel, copy the following query:

WITH res AS
(
SELECT st_polygon(
    st_boundary(
        (st_dump(
            st_geomFromText
            (
          
'TIN(
            ((0 0 0, 1 0 0, 0 1 0, 0 0 0)),
            ((0 0 0, 1 0 0, 0 0 1, 0 0 0)),
            ((0 0 0, 0 0 1, 0 1 0, 0 0 0)),
            ((0 0 1, 1 0 0, 0 1 0, 0 0 1))
            )'

            )
        )).geom
    )
    ,0
    ) as geom
)
SELECT st_forceCollection(st_collect(geom))
FROM res

image

Click the ‘Execute query’ icon

You get this result after a rotation.

image

The query used calls for an explanation:

WITH res AS
(
SELECT st_polygon(
    st_boundary(
        (st_dump(
            st_geomFromText
            (
          
'TIN(
            ((0 0 0, 1 0 0, 0 1 0, 0 0 0)),
            ((0 0 0, 1 0 0, 0 0 1, 0 0 0)),
            ((0 0 0, 0 0 1, 0 1 0, 0 0 0)),
            ((0 0 1, 1 0 0, 0 1 0, 0 0 1))
            )'

            )
        )).geom
    )
    ,0
    ) as geom
)
SELECT st_forceCollection(st_collect(geom))
FROM res

As with the POLYHEDRALSURFACE query, TIN geometries are not yet recognized by the GDAL library. Remember that a TIN is also a collection of TRIANGLEs, which GDAL does not recognize either; hence the slightly more complex query.
If you run the following query in pgAdmin III:

SELECT St_AsText(st_force_collection(
    st_geomFromText
    (
         'TIN(
            ((0 0 0, 1 0 0, 0 1 0, 0 0 0)),
            ((0 0 0, 1 0 0, 0 0 1, 0 0 0)),
            ((0 0 0, 0 0 1, 0 1 0, 0 0 0)),
            ((0 0 1, 1 0 0, 0 1 0, 0 0 1))
            )
'
    ))
)

You get this:

"GEOMETRYCOLLECTION Z (
TRIANGLE Z ((0 0 0,1 0 0,0 1 0,0 0 0)),
TRIANGLE Z ((0 0 0,1 0 0,0 0 1,0 0 0)),
TRIANGLE Z ((0 0 0,0 0 1,0 1 0,0 0 0)),
TRIANGLE Z ((0 0 1,1 0 0,0 1 0,0 0 1)))
"

This is not recognized by the GDAL library either, which explains the first part of the query: it rebuilds POLYGON geometries from the TRIANGLE geometries. The second part (where ST_Force_Collection reappears) rebuilds a GEOMETRYCOLLECTION Z containing POLYGON Z geometries, which GDAL can interpret.

To simplify queries on TIN geometries, a function performing this transformation seems essential.

 

A function converting a TIN geometry into a GEOMETRYCOLLECTION Z geometry

-- Function: ST_TinToGeomCollection(geometry, INTEGER)

-- DROP FUNCTION ST_TinToGeomCollection(geometry, INTEGER);

CREATE OR REPLACE FUNCTION ST_TinToGeomCollection(geometry, integer)
  RETURNS geometry AS
$BODY$
DECLARE
    sql text;
    rec record;
BEGIN 
    sql:=
'WITH res AS
         (
          SELECT st_polygon(st_boundary((st_dump(geometry('||quote_literal($1::text)||'))).geom),'||$2||') as geom
          )
         SELECT st_forceCollection(st_collect(geom)) as geom FROM res
';
    FOR rec IN EXECUTE sql LOOP
       RETURN rec.geom;
   
END LOOP;
END
;
$BODY$
  LANGUAGE plpgsql VOLATILE STRICT
  COST 100;
ALTER FUNCTION ST_TinToGeomCollection(geometry,integer)
  OWNER TO postgres;
GRANT EXECUTE ON FUNCTION ST_TinToGeomCollection(geometry, integer) TO public;
GRANT EXECUTE ON FUNCTION ST_TinToGeomCollection(geometry, integer) TO postgres;

Create the function in your 3D database:

  • Copy the contents of the frame above and paste it into the pgAdmin III query window
  • run it; the function is created.

The query for displaying a TIN geometry then becomes:

select st_astext(ST_TinToGeomCollection(
st_geomFromText
            (
            'TIN(
            ((0 0 0, 1 0 0, 0 1 0, 0 0 0)),
            ((0 0 0, 1 0 0, 0 0 1, 0 0 0)),
            ((0 0 0, 0 0 1, 0 1 0, 0 0 0)),
            ((0 0 1, 1 0 0, 0 1 0, 0 0 1))
            )'

            ) ,0))
The second parameter of the ST_TinToGeomCollection function is the SRID of the geometry, for example 4326 if you are in the WGS84 coordinate system. If you do not know it, use the value 0.

Displaying data from a 3DCityDB database containing CityGML data

In the Query panel, copy the following query:

WITH wallsurface as
(
SELECT a.id,c.root_id,st_collect(c.geometry) AS g,'255:128:0'::text AS rgb
FROM building a,thematic_surface b, surface_geometry c
WHERE b.building_id=a.id
AND b.type='WallSurface'
AND c.root_id=b.lod2_multi_surface_id
GROUP BY a.id,c.root_id
), roofsurface as
(
SELECT a.id,c.root_id,st_collect(c.geometry) AS g,'255:0:0'::text AS rgb
FROM building a,thematic_surface b, surface_geometry c
WHERE b.building_id=a.id
AND b.type='RoofSurface'
AND c.root_id=b.lod2_multi_surface_id
GROUP BY a.id,c.root_id
)
select id,root_id,g as geom,rgb
from wallsurface
union all
select id,root_id,g as geom,rgb
FROM roofsurface
ORDER BY id

image

Click the ‘Execute query’ icon

image
In the query you can assign a color to a geometry type; the syntax is an R:G:B code in quotes, followed by AS rgb:
'255:128:0' AS rgb  (wallsurface)
'255:0:0'   AS rgb  (roofsurface)
image
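For example, a third color could be added for ground surfaces using the same pattern (a sketch; it assumes your 3DCityDB data contains thematic surfaces of type 'GroundSurface'):

```sql
-- Hypothetical third block: gray ground surfaces, same R:G:B convention
SELECT a.id, c.root_id, st_collect(c.geometry) AS g, '128:128:128'::text AS rgb
FROM building a, thematic_surface b, surface_geometry c
WHERE b.building_id = a.id
  AND b.type = 'GroundSurface'
  AND c.root_id = b.lod2_multi_surface_id
GROUP BY a.id, c.root_id;
```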

 

Interface

image
image Exit the application.
image Connect to a database.

image Query panel
image Clears the query window.
image Opens a query file, which must contain a single query.
image Runs the query.
image Collapses the panel to the left.

image 3D display panel
image Toggles the background color between white and black.
image Zoom to full extent.
image Zoom in.
image Zoom out.
image Clears the render window; all previously loaded geometries are removed.

image Attribute data panel
image Collapses the panel to the bottom.

Conclusion

This post gives you a tool for visualizing 3D geometries stored in a PostGIS 2.0 (or later) database. It remains for me to implement the display of points and lines.

Any suggestions for improvement are welcome; I look forward to your feedback.

by Jérôme ROLLAND (noreply@blogger.com) at July 20, 2014 04:16 PM

A GeoSpatial World

PostGIS 3D Viewer introduction

 

A 3D viewer for PostGIS

Introduction

With version 2.0, PostGIS entered the world of 3D: it is now possible to store 3D geometries, with the addition of a Z coordinate to all the existing geometry types, and to store volumetric geometries, with the introduction of two new types:

  • TIN (Triangulated Irregular Network, a collection of TRIANGLEs)
  • POLYHEDRALSURFACE (a collection of POLYGONs)

The idea of a viewer for PostGIS 3D geometries had been in my head for a year, since an exchange with Claus Nagel and Felix Kunde of virtualcitySYSTEMS about the CityGML import tool I had written (CityGML to PostGIS 2.0); they had advised me then to move toward developing a 3D viewer. When I recently got back in touch to tell them about my work in progress, they welcomed the news with enthusiasm and received beta versions, which greatly helped me finalize my choices for the application.

Today 3D is more and more present in the GIS world with the CityGML format, and in the architecture world with the IFC format; importing these formats into a PostgreSQL/PostGIS database is a natural step, since it brings the full processing power of a spatial database to the management of such data.

For the CityGML format, virtualcitySYSTEMS provides a free, open-source database schema for storing and managing 3D city models. This schema mirrors the CityGML data model. Data can be imported with an import/export tool.

Displaying CityGML data in the 3D viewer

image

Continue reading “PostGIS 3D Viewer”

by Jérôme ROLLAND (noreply@blogger.com) at July 20, 2014 04:12 PM

July 15, 2014

Stephen Mather

LiDAR and pointcloud extension pt 6

There are all sorts of reasons why what I’ve been doing so far is wrong (see the previous 5 posts). More on that later. But in case you needed a 3D visualization of the chipping I’ve been working through, look no further:

Isometric 3D visualization of 3D chipping


by smathermather at July 15, 2014 12:32 AM

July 13, 2014

Boston GIS (Regina Obe, Leo Hsu)

PostGIS Minimalist X3D Viewer for PHP and ASP.NET

I've been thinking for a long time about creating an HTML based X3D viewer to visualize PostGIS 3D geometries. Now that I've started to document the new SFCGAL features, the need became more pressing. So I've created a very rudimentary one. You can check out the code on my github page https://github.com/robe2/postgis_x3d_viewer. I'm hoping to expand it in the future to allow people to pick materials for their geometries and also allow people to save the generated scene as a full X3D document.

x3d viewer for postgis

It utilizes the X3DOM Javascript library and JQuery for injecting X3D objects into the scene. I'm really excited about X3DOM JS. It was fairly trivial to work with and integrate the ST_AsX3D function. Utilizing it will help me stress test the ST_AsX3D function and patch up the holes in it. When I developed the function I was only interested in 3D geometries and didn't think too much about collections either. So as a result the function is kinda broken when working with 2D or Geometry Collections.

Features and Limitations

  • It has one scene and every query you do writes into the scene. This allows you to dump a bunch of objects on the same white board using different queries.
  • Similar to Raster Viewer I created a while ago, it's got a query log and a color picker. The color picker allows you to color each query output differently. I hope to augment this to allow picking different materials too.
  • Requires either .NET or PHP and of course a PostGIS 2+ enabled database
  • Similar to the Web raster viewer I created, it can only handle a query that outputs a single geometry (geometry mode) or an X3D scene snippet (raw mode)

Continue reading "PostGIS Minimalist X3D Viewer for PHP and ASP.NET"

by Regina Obe (nospam@example.com) at July 13, 2014 02:15 AM

July 12, 2014

Stephen Mather

LiDAR and pointcloud extension pt 5

Now for the crazy stuff:

Plaid-like overlay of vertical patches

The objective is to allow us to do vertical and horizontal summaries of our data. To do this, we’ll take chipped LiDAR input and further chip it vertically by classifying it.

First a classifier for height that we’ll use to do vertical splits on our point cloud chips:

CREATE OR REPLACE FUNCTION height_classify(numeric)
  RETURNS integer AS
$BODY$
 
SELECT
	CASE WHEN $1 < 10 THEN 1
	     WHEN $1 >= 10 AND $1 < 20 THEN 2
	     WHEN $1 >= 20 AND $1 < 30 THEN 3
	     WHEN $1 >= 30 AND $1 < 40 THEN 4
	     WHEN $1 >= 40 AND $1 < 50 THEN 5
	     WHEN $1 >= 50 AND $1 < 60 THEN 6
	     WHEN $1 >= 60 AND $1 < 70 THEN 7
	     WHEN $1 >= 70 AND $1 < 80 THEN 8
	     WHEN $1 >= 80 AND $1 < 90 THEN 9
	     WHEN $1 >= 90 AND $1 < 100 THEN 10
	     WHEN $1 >= 100 AND $1 < 110 THEN 11
	     WHEN $1 >= 110 AND $1 < 120 THEN 12
	     WHEN $1 >= 120 AND $1 < 130 THEN 13
	     WHEN $1 >= 130 AND $1 < 140 THEN 14
	     WHEN $1 >= 140 AND $1 < 150 THEN 15
	     WHEN $1 >= 150 AND $1 < 160 THEN 16
	     WHEN $1 >= 160 AND $1 < 170 THEN 17	     
	     WHEN $1 >= 170 AND $1 < 180 THEN 18
	     ELSE 0
	END
 
$BODY$
  LANGUAGE sql VOLATILE
  COST 100;
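To sanity-check the bin boundaries, feed a few sample heights through the classifier (an illustrative check, not part of the original post):

```sql
-- Inspect which bin each sample height lands in
SELECT h AS height, height_classify(h) AS bin
FROM unnest(ARRAY[5.0, 45.0, 95.0, 200.0]::numeric[]) AS h;
```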

And now, let’s pull apart our point cloud, calculate heights from approximate ground, then put it all back together in chips grouped by height classes and aggregates of our original chips. Yes, I will explain more later. And don’t do this at home; I’m not even sure yet whether it makes sense:

/*
First we explode the patches into points,
 and along the way grab the minimum value
for the patch, and the id
*/
WITH lraw AS (
    SELECT PC_Explode(pa) AS ex,  PC_PatchMin(pa, 'Z') AS min, id
    FROM test1
    ),
/*
Calculate our height value ( Z - min )
*/
heights AS (
    SELECT PC_Get(ex, 'X') || ',' || PC_Get(ex, 'Y') || ',' -- grab X and Y
    || PC_Get(ex, 'Z') - min                -- calculate height
    || ',' ||
-- And return our remaining dimensions
    PC_Get(ex, 'Intensity') || ',' || PC_Get(ex, 'ReturnNumber') || ',' || PC_Get(ex, 'NumberOfReturns') || ',' ||
    PC_Get(ex, 'ScanDirectionFlag') || ',' || PC_Get(ex, 'EdgeOfFlightLine') || ',' ||
    PC_Get(ex, 'Classification') || ',' || PC_Get(ex, 'ScanAngleRank') || ',' || PC_Get(ex, 'UserData') || ',' ||
    PC_Get(ex, 'PointSourceId') || ',' || PC_Get(ex, 'Time') || ',' || PC_Get(ex, 'PointID') || ',' || PC_Get(ex, 'BlockID')
    AS xyheight, PC_Get(ex, 'Z') - min AS height, id FROM lraw
    ),
-- Now we can turn this aggregated text into an array and then a set of points
heightcloud AS (
    SELECT PC_MakePoint(1, string_to_array(xyheight, ',')::float8[]) pt, height_classify(height) heightclass, id FROM heights
    )
-- Finally, we bin the points back into patches grouped by our height class and original ids.
SELECT PC_Patch(pt), id / 20 AS id, heightclass FROM heightcloud GROUP BY id / 20, heightclass;

The resultant output should allow us to query our database by height above ground and patch. Now we can generate vertical summaries of our point cloud. Clever or dumb? It’s too early to tell.
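If the query’s output is written into a table (say, chips_by_height, a hypothetical name), those vertical summaries become simple aggregates:

```sql
-- Points per height class across the whole cloud
-- (chips_by_height is a hypothetical table holding the query's output:
--  patch pa, id, heightclass)
SELECT heightclass, sum(PC_NumPoints(pa)) AS n_points
FROM chips_by_height
GROUP BY heightclass
ORDER BY heightclass;
```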


by smathermather at July 12, 2014 01:56 AM

July 11, 2014

Stephen Mather

LiDAR and pointcloud extension pt 4

Height cloud overlaid with patches

Now, we navigate into unknown (and perhaps silly) territory. We do point-level calculations of heights, and drop the newly calculated points back into the point cloud as though they were Z values. I don’t recommend this code as a practice; this is as much me thinking aloud as anything. Caveat emptor:

/*
First we explode the patches into points,
 and along the way grab the minimum value
for the patch, and the id
*/
WITH lraw AS (
	SELECT PC_Explode(pa) AS ex,  PC_PatchMin(pa, 'Z') AS min, id
	FROM test1
	),
/*
Calculate our height value ( Z - min ) 
*/
heights AS (
	SELECT PC_Get(ex, 'X') || ',' || PC_Get(ex, 'Y') || ',' -- grab X and Y
	|| PC_Get(ex, 'Z') - min				-- calculate height
	|| ',' ||
-- And return our remaining dimensions
	PC_Get(ex, 'Intensity') || ',' || PC_Get(ex, 'ReturnNumber') || ',' || PC_Get(ex, 'NumberOfReturns') || ',' ||
	PC_Get(ex, 'ScanDirectionFlag') || ',' || PC_Get(ex, 'EdgeOfFlightLine') || ',' ||
	PC_Get(ex, 'Classification') || ',' || PC_Get(ex, 'ScanAngleRank') || ',' || PC_Get(ex, 'UserData') || ',' ||
	PC_Get(ex, 'PointSourceId') || ',' || PC_Get(ex, 'Time') || ',' || PC_Get(ex, 'PointID') || ',' || PC_Get(ex, 'BlockID')
	AS xyheight, id FROM lraw
	),
-- Now we can turn this aggregated text into an array and then a set of points
heightcloud AS (
	SELECT PC_MakePoint(1, string_to_array(xyheight, ',')::float8[]) pt, id FROM heights
	)
-- Finally, we bin the points back into patches
SELECT PC_Patch(pt) FROM heightcloud GROUP BY id / 20;
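
To get an overlay like the image above, the final SELECT in the statement can be swapped for a version that casts each rebuilt patch to a polygon for display in a desktop GIS (the cast requires the pointcloud_postgis extension; the SRID matches the pipeline in the next post down):

```sql
-- Hypothetical display variant of the final SELECT above:
-- each rebuilt patch becomes a polygon envelope.
SELECT PC_Patch(pt)::geometry(Polygon, 3362) AS geom
FROM heightcloud
GROUP BY id / 20;
```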

by smathermather at July 11, 2014 03:11 AM

July 09, 2014

Stephen Mather

LiDAR and pointcloud extension pt 3

Digging a little deeper. I ran the chipper on a smaller number of points and am now doing a little hacking to get a height per chip (if you start to get lost, go to Paul Ramsey’s tutorial).

Here’s my pipeline file. Note the small chipper capacity: 20 points per chip.

<?xml version="1.0" encoding="utf-8"?>
<Pipeline version="1.0">
  <Writer type="drivers.pgpointcloud.writer">
    <Option name="connection">dbname='pggis' user='pggis' host='localhost'</Option>
    <Option name="table">test1</Option>
    <Option name="srid">3362</Option>
    <Filter type="filters.chipper">
      <Option name="capacity">20</Option>
      <Filter type="filters.cache">
        <Reader type="drivers.las.reader">
          <Option name="filename">60002250PAN.las</Option>
          <Option name="spatialreference">EPSG:3362</Option>
        </Reader>
      </Filter>
    </Filter>
  </Writer>
</Pipeline>

Easy enough to load (though slow, on account of the small chip size):

pdal pipeline las2pg.xml
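
Once the pipeline completes, a quick sanity check in psql confirms the load (the patch column the pgpointcloud writer creates is named pa by default):

```sql
-- Each patch should hold at most 20 points, per the chipper
-- capacity set in the pipeline above.
SELECT Count(*) AS patches, Sum(PC_NumPoints(pa)) AS points
FROM test1;
```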

Now we can do some really quick and dirty height calculations per chip — like what’s the difference between the minimum and maximum value in the chip:

DROP TABLE IF EXISTS patch_heights;

CREATE TABLE patch_heights AS
SELECT
  pa::geometry(Polygon, 3362) AS geom,
  PC_PatchAvg(pa, 'Z') AS elevation,
  PC_PatchMax(pa, 'Z') - PC_PatchMin(pa, 'Z') AS height
FROM test1;
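
With a height attached to each chip, tall features (trees, buildings) can be pulled straight out with a filter; the threshold below is an arbitrary example in the data’s vertical units:

```sql
-- Grab only the patches likely to contain trees or structures.
SELECT geom, height
FROM patch_heights
WHERE height > 10
ORDER BY height DESC;
```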

Patch heights showing vegetation and open areas


by smathermather at July 09, 2014 02:55 AM