QGIS Planet

PostGIS 3D – Foss4g video and workshop

The latest PostGIS and QGIS 3D enhancements presented at FOSS4G by Oslandia are available online. We suggest you have a look at our PostGIS 3D / QGIS 3D video demonstration using the SFCGAL library and the QGIS Horao plugin.

A step-by-step workshop (closely following the video workflow) is also available online: https://github.com/Oslandia/Workshops/tree/master/FOSS4G_2013_PostGIS_3D

We can provide the full virtual machine on demand, with the proper software environment (6 GB VirtualBox image).

We would be really interested in your feedback on these new 3D features and in the use cases you have in mind. Do not hesitate to get in touch.

Contact us at [email protected] for any information.

QGIS Community meeting in Brighton

Developers and contributors from the QGIS project are used to gathering in person twice a year in different countries. Such an event allows people to synchronize their efforts and discuss possible new developments. The latest QGIS community meeting took place in Brighton from the 12th to the 16th of September, just before the FOSS4G event. It was the biggest community meeting organized so far, with close to 50 people attending! Everything went smoothly thanks to the perfect organization by Lutra Consulting.

This session was of particular interest in the project’s history, since it was dedicated to the release of the eagerly-awaited new 2.0 version of QGIS.

Oslandia regularly takes part in these events and even organized the March 2012 session in Lyon.


Presentations

Although the event is primarily oriented toward coding and translation work, some presentations also took place. Some of them were video recorded, others were not. Here is a subset of them.

A new website

In parallel with the release of the 2.0 version, the QGIS website has been updated: not only its look and feel, but also the way it is built. Richard Duivenvoorde presented the effort that has been put into supporting multiple languages, adapting to mobile devices, and reusing the tools used to build the project's documentation. The new website is now online.

Richard presenting the new website

 

Presentation of the new website : http://www.ustream.tv/recorded/38687971

Constraints on attributes

Some more developer-oriented presentations and discussions also took place. Matthias Kuhn and Nathan Woodrow presented an idea about extending the way attributes are handled by QGIS. In particular, the concept of constrained attributes emerged. The idea is to be able to express, manipulate and edit constraints on attributes (a possible range of values, for instance) as is found in databases. This could then be used to constrain user editing of layers, presenting the user with an appropriate widget (a combo box for an enumeration, for instance), especially for layers that do not have native support for these constraints.
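As an illustration, here is a minimal sketch (with hypothetical table and column names) of the kind of database-side constraints the discussion referred to; the idea is that QGIS could pick such constraints up and present matching edit widgets:

-- Hypothetical example: constraints declared in PostgreSQL
CREATE TYPE road_status AS ENUM ('planned', 'under_construction', 'in_service');

CREATE TABLE roads (
    id         serial PRIMARY KEY,
    status     road_status NOT NULL,                         -- enumeration: a combo box widget would fit
    max_speed  integer CHECK (max_speed BETWEEN 10 AND 130), -- range constraint: a spin box with limits
    geom       geometry(LineString, 4326)
);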

QGIS for Android tablets

RealworldSystems presented their work on what they call the "QGIS Mobility framework", based on previous work by Marco Bernasocchi on QGIS for Android. It is dedicated to the design of custom QGIS applications for deployment on Android tablets (for in-the-field editing campaigns, for instance). It looks promising and has already been used in a real-world application for gas pipeline inspection. The framework can be found on GitHub.

QGIS webserver

Andreas Neumann presented evolutions of QGIS webserver and webclient. More can be found in the corresponding video.

Andreas presenting the work on QGIS webserver and webclient

Video 1 http://www.ustream.tv/recorded/38741015

Evolution of the Globe plugin

Matthias Kuhn presented the evolutions he made to the Globe plugin, which allows displaying a 3D earth with different kinds of data on it. Lots of osgEarth features are now integrated into the Globe plugin (in particular support for 2D vector layers).

Matthias presenting his work on the Globe plugin

Video 2 http://www.ustream.tv/recorded/38737991

Visualisation of 3D data

Oslandia also presented its ongoing work on the integration of PostGIS 3D. After a thorough evaluation of osgEarth, which is the base of the Globe plugin, we decided to develop our own 3D visualisation stack directly on top of OpenSceneGraph.

A QGIS plugin has also been developed in order to be able to view QGIS layers in 3D.

With this new 3D visualisation stack we are able to display and manipulate data for a whole city at between 20 and 60 frames per second on a laptop (the demo was built on data from the city of Lyon), whereas we were hardly able to display a small city quarter with Globe.

Oslandia presenting its work on its 3D visualisation stack

Video 3 http://www.ustream.tv/recorded/38738897

Slides https://github.com/Oslandia/presentations/tree/master/qgis_hf_2013

QGIS 2.0

All the work done during this community meeting helped polish the 2.0 version of QGIS, which was publicly announced by Tim Sutton during FOSS4G in Nottingham.
Waiting now for the 2.1 release 🙂

Movement data in GIS: issues & ideas

Since I’ve started working, transport and movement data have been at the core of many of my projects. The spatial nature of movement data makes it interesting for GIScience but typical GIS tools are not a particularly good match.

Dealing with the temporal dynamics of geographic processes is one of the grand challenges for Geographic Information Science. Geographic Information Systems (GIS) and related spatial analysis methods are quite adept at handling spatial dimensions of patterns and processes, but the temporal and coupled space-time attributes of phenomena are difficult to represent and examine with contemporary GIS. (Dr. Paul M. Torrens, Center for Urban Science + Progress, New York University)

It’s still a hot topic right now, as the variety of related publications and events illustrates. For example, just this month, there is an Animove two-week professional training course (18–30 September 2016, Max-Planck Institute for Ornithology, Lake Konstanz) as well as the GIScience 2016 Workshop on Analysis of Movement Data (27 September 2016, Montreal, Canada).

Space-time cubes and animations are classics when it comes to visualizing movement data in GIS. They can be used for some visual analysis but have their limitations, particularly when it comes to working with and trying to understand lots of data. Visualization and analysis of spatio-temporal data in GIS is further complicated by the fact that the temporal information is not standardized in most GIS data formats. (Some notable exceptions of formats that do support time by design are GPX and NetCDF but those aren’t really first-class citizens in current desktop GIS.)

Most commonly, movement data is modeled as points (x,y, and optionally z) with a timestamp, object or tracker id, and potential additional info, such as speed, status, heading, and so on. With this data model, even simple questions like “Find all tracks that start in area A and end in area B” can become a real pain in “vanilla” desktop GIS. Even if the points come with a sequence number, which makes it easy to identify the start point, getting the end point is tricky without some custom code or queries. That’s why I have been storing the points in databases in order to at least have the powers of SQL to deal with the data. Even so, most queries were still painfully complex and performance unsatisfactory.
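To give an idea of the kind of custom SQL this requires, here is a minimal sketch (assuming a hypothetical table tracks(track_id, t, geom) of timestamped points) that extracts each track's first and last point, which could then be tested against areas A and B:

-- Hypothetical sketch: first and last point per track from raw timestamped points
SELECT DISTINCT ON (track_id)
       track_id,
       first_value(geom) OVER w AS start_geom,
       last_value(geom)  OVER w AS end_geom
FROM tracks
WINDOW w AS (PARTITION BY track_id ORDER BY t
             ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
ORDER BY track_id;

Filtering tracks "from area A to area B" then becomes a simple spatial test on start_geom and end_geom, but the query is already noticeably more involved than it would be with a proper trajectory type.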

So I reached out to the Twitterverse asking for pointers towards moving objects database extensions for PostGIS and @bitnerd, @pwramsey, @hruske, and others replied. Amongst other useful tips, they pointed me towards the new temporal support, which ships with PostGIS 2.2. It includes the following neat functions:

  • ST_IsValidTrajectory — Returns true if the geometry is a valid trajectory.
  • ST_ClosestPointOfApproach — Returns the measure at which points interpolated along two lines are closest.
  • ST_DistanceCPA — Returns the distance between closest points of approach in two trajectories.
  • ST_CPAWithin — Returns true if the trajectories’ closest points of approach are within the specified distance.

Instead of points, these functions expect trajectories that are stored as LinestringM (or LinestringZM) where M is the time dimension. This approach makes many analyses considerably easier to handle. For example, clustering trajectory start and end locations and identifying the most common connections:

animation_clusters

(data credits: GeoLife project)
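As a rough sketch of how this looks in practice (again assuming a hypothetical tracks(track_id, t, geom) table of timestamped points), trajectories can be assembled with ST_MakeLine over points carrying the timestamp as their M value and then queried with the new functions:

-- Build LinestringM trajectories from timestamped points (hypothetical table "tracks")
CREATE TABLE trajectories AS
SELECT track_id,
       ST_MakeLine(ST_MakePointM(ST_X(geom), ST_Y(geom), extract(epoch FROM t))
                   ORDER BY t) AS traj
FROM tracks
GROUP BY track_id;

-- Trajectories must have increasing M values
SELECT track_id FROM trajectories WHERE NOT ST_IsValidTrajectory(traj);

-- Minimum distance between two moving objects (closest point of approach)
SELECT ST_DistanceCPA(a.traj, b.traj)
FROM trajectories a, trajectories b
WHERE a.track_id = 1 AND b.track_id = 2;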

Overall, it’s an interesting and promising approach but there are still some open questions I’ll have to look into, such as: Is there an efficient way to store additional info for each location along the trajectory (e.g. instantaneous speed or other status)? How well do desktop GIS play with LinestringM data and what’s the overhead of dealing with it?


Getting multipolygon vertexes using PostGIS

EN | PT

Today I needed to create a view in PostGIS that returned the vertexes of a multi-polygon layer. Moreover, I needed them to be numbered sequentially starting at 1 and to include their respective XY coordinates.

Screenshot from 2015-11-05 23:58:19

It seemed to be a trivial task – all I would need was to use the ST_DumpPoints() function to get all vertexes – if it wasn't for the fact that PostGIS polygons have a duplicate vertex (the last vertex must be equal to the first one) that I have no interest in showing.

After some try-and-fail, I came up with the following query:

CREATE OR REPLACE VIEW public.my_polygons_vertexes AS
WITH t AS -- Transform polygons into sets of points
    (SELECT id_polygon,
            st_dumppoints(geom) AS dump
     FROM public.my_polygons),
f AS -- Get the geometry and the indexes from the sets of points 
    (SELECT t.id_polygon,
           (t.dump).path[1] AS part,
           (t.dump).path[3] AS vertex,
           (t.dump).geom AS geom
     FROM t)
-- Get all points filtering the last point for each geometry part
SELECT row_number() OVER () AS gid, -- Creating a unique id
       f.id_polygon,
       f.part,
       f.vertex,
       ST_X(f.geom) as x, -- Get point's X coordinate
       ST_Y(f.geom) as y, -- Get point's Y coordinate
       f.geom::geometry('POINT',4326) as geom -- make sure of the resulting geometry type
FROM f 
WHERE (f.id_polygon, f.part, f.vertex) NOT IN
      (SELECT f.id_polygon,
              f.part,
              max(f.vertex) AS max
       FROM f
       GROUP BY f.id_polygon,
                f.part);

The interesting part happens in the WHERE clause: from the list of all vertexes, only those that are not in the list of vertexes with the maximum index per polygon part are shown; in other words, the last vertex of each polygon part is filtered out.

Here’s the result:

Screenshot from 2015-11-05 23:58:40

The advantage of this approach (using PostGIS) over the "Polygons to Lines" and "Lines to points" processing tools is that we just need to edit the polygons layer and save it to see our vertexes updated automatically. It's because of this kind of stuff that I love PostGIS.

Labels leading lines with QGIS and Postgis

EN | PT

Recently I needed to add labels to features with very close geometries, which resulted in colliding labels.

Capturar_3

Using a data-defined override for the label position (I used the Layer to Labeled Layer plugin to set this up really quickly) and the QGIS tool to move labels, it was quite easy to relocate them to better places. However, in some cases it was difficult to tell which geometry they belonged to.

Capturar_2

I needed some kind of leading line to connect, whenever necessary, a label to its feature. I knew of another great plugin called "Easy Custom Labeling", by Regis Haubourg, that did what I needed, but it creates an in-memory duplicate of the original layer, which meant that any edit to the original layer would not be reflected in the labels.

Since the data were stored in a PostgreSQL/PostGIS database, I decided to create a query that returns a layer of leading lines. I used the following query in DB Manager:

SELECT
  gid,
  label,
  ST_Makeline(St_setSRID(ST_PointOnSurface(geom),27493), St_setSRID(St_Point(x_label::numeric, y_label::numeric),27493))
FROM
  epvu.sgev
WHERE
  x_label IS NOT NULL AND
  y_label IS NOT NULL AND
  NOT ST_Within(ST_Makeline(St_setSRID(ST_PointOnSurface(geom),27493), St_setSRID(St_Point(x_label::numeric, y_label::numeric),27493)),geom)

This query creates a line using a point on the feature's surface (ST_PointOnSurface) as the start point and the label coordinates as the end point. The last condition in the WHERE clause ensures that lines are only created for labels placed outside their feature.

Capturar_1

With the resulting layer loaded in my project, all I need to do is move my labels and save the edits (and press refresh) to get a nice leading line.

guidelines

A new QGIS tool (based on ogr2ogr) to import vectors in PostGIS, the fast way

In QGIS there are many tools that can be used to import vectors into a PostGIS database, each one with pros and cons: SPIT core plugin: available for a long time, but it now seems to be an unmaintained tool and will therefore probably be removed in a future QGIS release. It has the advantage of allowing […]

Dissolving polygons in PostgreSQL/PostGIS

This is a very common scenario in spatial analysis. Given a layer/table composed of several polygons, we want to "merge" them according to distinct values of one or more attributes (for example: from a layer with parish boundaries, obtain the municipalities; or, from the 3rd level of the COS land-cover map, obtain the 2nd or 1st level).

The goal of this article is to show how to do it in PostgreSQL/PostGIS.

Example table

As an example I will use a table with the following structure:

CREATE TABLE tabela_1
    (gid serial PRIMARY KEY,
     campo1 character varying(128),
     campo2 integer,
     geom geometry(MultiPolygon,27493));

tabela1_original_tabela

tabela1_original

Dissolving all polygons

First of all, we can simply aggregate all features into a single multi-polygon. To do so, we use the ST_Union() function.

SELECT
    ST_Union(t.geom) as geom
FROM
    tabela_1 as t;

tabela1_union

Splitting non-contiguous polygons

If, on the other hand, we do not want multi-polygons in the result, we use the ST_Dump() function and extract its geometry field.

SELECT
    (ST_Dump(ST_Union(t.geom))).geom as geom
FROM
    tabela_1 as t;

tabela1_union_dump

Dissolving polygons based on field values

If we want to dissolve polygons that have the same values in one or more fields, we just include those fields in the GROUP BY clause. If we want those fields to appear in the result (we usually do), they must also be listed at the beginning of the SELECT.

SELECT
    campo1,
    campo2,
    (ST_Dump(ST_Union(t.geom))).geom as geom
FROM
    tabela_1 as t
GROUP BY
    campo1,
    campo2;

tabela1_union_by_value

Note 1: For those who prefer graphical interfaces, filling in forms and clicking buttons, using SQL for this kind of operation may seem overly complicated and even a bit old-fashioned. But I can guarantee one thing: with some practice the initial difficulties are overcome, and the benefits of this kind of approach are very rewarding.

Note 2: Visualizing the result of this kind of aggregation query (using the GROUP BY clause) in QGIS can be challenging; this article explains how to overcome that difficulty.


Training courses calendar: QGIS (Desktop, Server and Web) and PostGIS

We just published our Training Courses calendar for the period September 2014 – January 2015. This includes training courses about QGIS (Desktop, Server and Web) and PostgreSQL/PostGIS in both Italy and Portugal. Training courses about QGIS python programming are available on demand. For details (locations, prices, discounts, etc.) about training courses in Portugal see: http://www.faunalia.eu/pt/training.html […]

Packaging PostGIS dev releases with Docker

We recently added support for GML curves to PostGIS, which enables TinyOWS to deliver WFS requests with curve geometries. More on this in a later post. This enhancement is in the current PostGIS development version (SVN master) and not released yet. To enable our customer to test this functionality, we had to build packages for their server environment, which is Ubuntu Precise with UbuntuGIS repositories. After working with Linux LXC containers and their predecessor VServer for years, Docker was a logical choice for a clean, reproducible build environment.

Rebuilding a Debian package is usually quite easy:

apt-get build-dep <package>
apt-get source <package>
cd <packagedir>
#Make your changes
dch -i
dpkg-buildpackage

But getting build dependencies for PostGIS currently fails with libssl-dev conflicts, maybe because the dev packages got out of sync after the recent Heartbleed updates. So the Dockerfile uses equivs to build a dummy package which satisfies the dependencies.

The command

docker run -v /tmp:/pkg sourcepole/postgis-svn-build-env sh -c 'cp /root/*postgis*.deb /pkg'

loads the Docker image with packages built from the latest SVN version of PostGIS in /root and copies the deb files from the container into /tmp.

Now we're ready to install these packages on the Ubuntu server:

sudo dpkg -i /tmp/*postgis*.deb

That's it. Feedback welcome!

@PirminKalberer

P.S.

If you happen to be a developer, then you may prefer running a cutting-edge version of PostGIS in a Docker container instead of building packages. Our colleagues from Oslandia just published how to do this.

Installing PostGIS on Fedora 20

In order to explore all the new interfaces to PostGIS (from QGIS, GDAL, GRASS GIS 7 and others) I decided to install PostGIS 2.1 on my Fedora 20 Linux box. In the end it is an easy job, but I had to visit a series of blogs to refresh my dim memories of past PostGIS installations done some years ago… So, here are the few steps:

# become root
su -
# grab the PostgreSQL 9.3 server and PostGIS 2.1
yum install postgresql-server postgresql-contrib postgis

Now the server is installed but still inactive and not configured. The next step is to initialize, configure and start the PostgreSQL server:

# initialize DB:
postgresql-setup initdb
# start at boot time:
chkconfig postgresql on
# fire up the daemon:
service postgresql start

A test connection will show that we need to configure TCP/IP connections:

# this will fail
psql -l
psql: could not connect to server: No such file or directory
    Is the server running locally and accepting
    connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

So we enable TCP/IP connections listening on port 5432:

# edit "postgresql.conf" with editor of choice (nano, vim, ...),
# add line "listen_addresses = '*'":
vim /var/lib/pgsql/data/postgresql.conf
...
listen_addresses = '*'
#listen_addresses = 'localhost'         # what IP address(es) to listen on;
...

Now restart the PostgreSQL daemon:

# restart to re-read configuration
service postgresql restart

Check if the server is now listening on TCP/IP:

# check network connections:
netstat -l | grep postgres
tcp        0      0 0.0.0.0:postgres        0.0.0.0:*               LISTEN     
tcp6       0      0 [::]:postgres           [::]:*                  LISTEN     
unix  2      [ ACC ]     STREAM     LISTENING     2236495  /var/run/postgresql/.s.PGSQL.543

So far so good. PostGIS is still not active, though, and we need a database user "gisuser" with a password:

# switch from root user to postgres user:
su - postgres
# create new DB user with password (will prompt you for it, choose a strong one):
createuser --pwprompt --encrypted gisuser

Finally we create a first database “gis”:

# create new DB
createdb --encoding=UTF8 --owner=gisuser gis

We enable it for PostGIS 2.1:

# insert PostGIS SQL magic (it should finish with a "COMMIT"):
psql -d gis -f /usr/share/pgsql/contrib/postgis-64.sql
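As a side note: with PostGIS 2.x, an alternative to loading the SQL script is the extension mechanism (assuming the Fedora packages ship the extension control files). Connected to the "gis" database as the "postgres" user, the following achieves the same result:

-- Enable PostGIS via the extension mechanism (alternative to the SQL script)
CREATE EXTENSION postgis;
-- Verify the installation
SELECT postgis_full_version();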

That’s it! Now exit from the “postgres” user account back to the “root” account:

# exit from PG user account (back to "root" account level):
exit

In case you want to reach the PostgreSQL/PostGIS server from outside your machine (i.e. from the network), you need to enable PostgreSQL for that:

# enable the network in pg_hba.conf (replace host line; perhaps comment IP6 line):
vim /var/lib/pgsql/data/pg_hba.conf

...
host    all             all             all                     md5
...

# save and restart the daemon:
service postgresql restart

Time for another connection test:

# we try our new DB user account on the new database (hostname is the 
# name of the server in the network):
psql --user gisuser -h hostname -l
Password for user gisuser: xxxxxx

Wonderful, we are connected!

# exit from root account:
exit
[neteler@oboe ] $

Now we have our PostGIS database ready!

What’s left? Get some spatial data in as a normal user:

# nice tool
shp2pgsql-gui

Using the shp2pgsql-gui

Next, pick a Shapefile and upload it to PostGIS with “Import”.

Now connect to your PostGIS database with QGIS or GRASS GIS and enjoy!

See also: https://fedoraproject.org/wiki/PostgreSQL


OSGeo Code Sprint, Vienna

This is how OSGeo happens. These are the folks who bring us a lot of that open-source geospatial goodness. You can follow the code sprint on Twitter using the hashtags #csprint and #viennacodesprint14

 


Two book recommendations

I recently finished reading two books which may be of interest to open-source GIS users – “PostGIS Cookbook” and “The PyQGIS Programmer’s Guide“, both of which I highly recommend:

PostGIS Cookbook

I’ve been a fan of Stephen Mather’s blog for a while now, and have consistently found it to be a great source of trustworthy information and creative solutions to GIS problems. So when I first saw mention of his work on the PostGIS Cookbook I knew it would be a must-read for me. PostGIS is an essential part of my daily toolkit, and I’ll quickly devour any tutorial or guide which can lead me to better ways to put it to use. And that’s exactly what this book is! It’s full of tips and guides which have inspired me to try techniques I’d never attempted or even thought possible in PostGIS.

It’s important to point out that this book isn’t a training manual or beginner’s guide to PostGIS. It assumes readers are already familiar with using PostGIS and have a good understanding of GIS software in general. (If you’re looking for a book to start from scratch with PostGIS, PostGIS in Action is a better fit.) I think that’s really what makes this book stand out though. There are currently not many books available covering PostGIS, and as far as I’m aware the PostGIS Cookbook is the only one targeted at experienced PostGIS users.

Highlights for me are:

  • A great explanation and write up on optimised KNN filtering in PostGIS (something which often trips me up)
  • The detailed guide to topologically correct simplification of features
  • The exploration of PgRouting, which is a great introduction to PostGIS’ routing abilities
  • The “PostGIS and the web” chapter – I really wasn’t expecting this, but it’s quite eye opening (I’m going to have to do some digging into GeoDjango sometime)

The only criticism I have of this book is that it jumps around a lot between operating systems. While most of the code is provided for both Linux/OSX and Windows, there are occasional examples which only have code for one specific operating system. It’s a little jarring and assumes the user is well versed enough in their particular operating system to work around these omissions.

Overall, I strongly recommend the PostGIS Cookbook, and would consider it a must have for anyone serious about expanding their PostGIS abilities. (Also, looks like the publisher, Packt, have a two-for-one sale going at the moment, so it’s a good time to grab this title).

The PyQGIS Programmer’s Guide

The second book I’ve just finished reading is Gary Sherman’s “The PyQGIS Programmer’s Guide“. For those who are unaware, Gary was the original founder of QGIS back in 2002, so you can be confident that he knows exactly what he’s writing about. In The PyQGIS Programmer’s Guide, Gary has created an in-depth guide on how to get started with programming for QGIS using Python. It takes readers all the way from simple scripts right through to developing QGIS plugins and standalone applications based on the QGIS API.

This book fills an important void in the literature available for QGIS. Previously, the PyQGIS Developer Cookbook was the only available guide for QGIS Python scripting, and unfortunately it’s a little out of date now. PyQGIS scripting can have a steep learning curve, which is why this book is so appreciated.

It would be valuable to have some Python knowledge and experience prior to reading this book. While the “Python Basics” chapter quickly runs through an introduction to the language, the book makes no claims to be a comprehensive Python tutorial. But if you’ve dabbled in the language before and are familiar with the Python way of doing things you’ll easily be able to follow along.

Highlights are:

  • The “Tips and Techniques” chapter, which is a great mini-reference for performing a range of common tasks in PyQGIS (including loading layers, changing symbol styles, editing feature attributes, etc).
  • A complete tutorial for creating a QGIS plugin
  • A guide to debugging PyQGIS code and plugins

I’d definitely recommend that anyone who wants to get started with PyQGIS start with Gary’s work – you’ll find it the perfect place to begin.

50th ICA-OSGeo Lab established at Fondazione Edmund Mach (FEM)

We are pleased to announce that the 50th ICA-OSGeo Lab has been established at the GIS and Remote Sensing Unit (Piattaforma GIS & Remote Sensing, PGIS), Research and Innovation Centre (CRI), Fondazione Edmund Mach (FEM), Italy. CRI is a multifaceted research organization established in 2008 under the umbrella of FEM, a private research foundation funded by the government of the Autonomous Province of Trento. CRI focuses on studies and innovations in the fields of agriculture, nutrition and environment, with the aim of generating and sharing new knowledge and contributing to economic growth, social development and the overall improvement of quality of life.

The mission of the PGIS unit is to develop and provide multi-scale approaches for the description of 2-, 3- and 4-dimensional biological systems and processes. Core activities of the unit include acquisition, processing and validation of geo-physical, ecological and spatial datasets collected within various research projects and monitoring activities, along with advanced scientific analysis and data management. These studies involve multi-decadal change analysis of various ecological and physical parameters from continental to landscape level using satellite imagery and other climatic layers. The lab focuses on the geostatistical analysis of such information layers, the creation and processing of indicators, and the production of ecological, landscape genetics, eco-epidemiological and physiological models. The team pursues actively the development of innovative methods and their implementation in a GIS framework including the time series analysis of proximal and remote sensing data.

The GIS and Remote Sensing Unit (PGIS) members strongly support the peer-reviewed approach of Free and Open Source software development, which is perfectly in line with academic research. PGIS contributes extensively to open source geospatial software development (its members are main contributors to GRASS GIS), often collaborating with various other developers and researchers around the globe. The new ICA-OSGeo lab at FEM hosts international PhD students, university students and trainees.

PGIS focuses on disseminating knowledge of open source tools through a series of courses designed for specific user requirements (schools, universities, research institutes), blogs, workshops and conferences. Their recent publication in Trends in Ecology and Evolution underlines the need to use Free and Open Source Software (FOSS) for completely open science. Dr. Markus Neteler, who has led the group since its formation, has two decades of experience in developing and promoting open source GIS software. A founding member of the Open Source Geospatial Foundation (OSGeo.org, USA), he served on its board of directors from 2006 to 2011. Luca Delucchi, focal point and responsible person for the new ICA-OSGeo Lab, is a member of the board of directors of the Associazione Italiana per l’Informazione Geografica Libera (GFOSS.it, the Italian local chapter of OSGeo). He contributes to several Free and Open Source software and open data projects as a developer and trainer.

Details about the GIS and Remote Sensing Unit at http://gis.cri.fmach.it/

Open Source Geospatial Foundation (OSGeo) is a not-for-profit organisation founded in 2006 whose mission is to support and promote the collaborative development of open source geospatial technologies and data.

International Cartographic Association (ICA) is the world authoritative body for cartography and GIScience. See also the new ICA-OSGeo Labs website.

Public transport isochrones with pgRouting

This post covers a simple approach to calculating isochrones in a public transport network using pgRouting and QGIS.

For this example, I’m using the public transport network of Vienna which is loaded into a pgRouting-enabled database as network.publictransport. To create the routable network run:

select pgr_createTopology('network.publictransport', 0.0005, 'geom', 'id');

Note that the tolerance parameter 0.0005 (units are degrees) controls how far link start and end points can be apart and still be considered as the same topological network node.

To create a view with the network nodes run:

create or replace view network.publictransport_nodes as
select id, st_centroid(st_collect(pt)) as geom
from (
	(select source as id, st_startpoint(geom) as pt
	from network.publictransport
	) 
union
	(select target as id, st_endpoint(geom) as pt
	from network.publictransport
	) 
) as foo
group by id;

To calculate isochrones, we need a cost attribute for our network links. To calculate travel times for each link, I used average speeds: 15 km/h for buses and trams and 32 km/h for metro lines (similar to data published by the city of Vienna).

alter table network.publictransport add column length_m integer;
update network.publictransport set length_m = st_length(st_transform(geom,31287));

alter table network.publictransport add column traveltime_min double precision;
update network.publictransport set traveltime_min = length_m  / 15000.0 * 60; -- average is 15 km/h
update network.publictransport set traveltime_min = length_m  / 32000.0 * 60 where "LTYP" = '4'; -- average metro is 32 km/h

That’s all the preparation we need. Next, we can calculate our isochrone data using pgr_drivingdistance, e.g. for network node #1:

create or replace view network.temp as
 SELECT seq, id1 AS node, id2 AS edge, cost, geom
  FROM pgr_drivingdistance(
    'SELECT id, source, target, traveltime_min as cost FROM network.publictransport',
    1, 100000, false, false
  ) as di
  JOIN network.publictransport_nodes pt
  ON di.id1 = pt.id;

The resulting view contains all network nodes which are reachable within 100,000 cost units (which are minutes in our case).

Let’s load the view into QGIS to visualize the isochrones:

isochrone_publictransport_1

The trick is to use data-defined size to calculate the different walking circles around the public transport stops. For example, we can set up 10-minute isochrones which take into account how much time was used to travel by public transport and show how far we can get by walking in the time that is left:

1. We want to scale the circle radius to reflect the remaining time left to walk. Therefore, enable Scale diameter in Advanced | Size scale field:

scale_diameter

2. In the Simple marker properties change size units to Map units.
3. Go to data defined properties to set up the dynamic circle size.

datadefined_size

The expression makes sure that only nodes reachable within 10 minutes are displayed. It then calculates the remaining time (10 - "cost") and assumes that we can walk 100 meters for each minute that is left. It additionally multiplies by 2, since we are scaling the diameter instead of the radius.
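The screenshot is not reproduced here, but a minimal sketch of such a size expression (written in QGIS expression syntax, assuming the cost column holds minutes, a walking speed of 100 m per minute, and map units in meters) could look like this:

CASE WHEN "cost" <= 10
     THEN (10 - "cost") * 100 * 2
     ELSE 0
END

The factor 100 converts the remaining minutes into walkable meters, and the factor 2 turns that radius into a diameter.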

To calculate isochrones for different start nodes, we simply update the definition of the view network.temp.

While this approach certainly has its limitations, it’s a good place to start learning how to create isochrones. A better solution should take into account that it takes time to change between different lines. While preparing the network, more care should be taken to ensure that possible interchange nodes are modeled correctly. Some network links might only be usable in one direction. Not to mention that there are timetables which could be accounted for ;)


pgRouting 2.0 for Windows quick guide

This post is a quick instruction for installing Postgres 9.2, PostGIS 2.0 and pgRouting 2.0.

  1. For Postgres, download the installer from enterprisedb.com.
  2. Run the installer. You’ll have to pick a superuser password – remember it, you’ll need it again soon.
  3. At the end of the installation process, allow the installer to start Stack Builder.
  4. In Stack Builder, select the Postgres 9.2 installation and install PostGIS from the list of available extensions.
  5. The PostGIS installation procedure will prompt you for the superuser password you picked before.
  6. I suggest letting the installer create a sample database. We’ll need it later anyway.

Now for the pgRouting part:

  1. Download the pgRouting zip file for your system (32 or 64 bit) from Winnie.
  2. Unzip the file. It contains bin, lib and share folders as well as two text files.
  3. Copy these folders and files over to your Postgres installation. In my case C:\Program Files\PostgreSQL\9.2

Installation – done.

Next, fire up pgAdmin. If you allowed the PostGIS installer to create a sample database, you will find it named postgis20 or similar. Otherwise just create a new database using the PostGIS template database. You can enable pgRouting in a database using the following steps:

  1. In postgis20, execute the following query to create the pgrouting extension. This will add the pgRouting-specific functions:
    CREATE EXTENSION pgrouting;
  2. Test if everything worked fine:
    SELECT pgr_version();

    It should return "(2.0.0-dev,v2.0.0-beta,18,a3be38b,develop,1.46.1)" or similar – depending on the version you downloaded.

pgadmin_pgrouting

How about some test data? I’ll be using the public transport network of the city of Vienna provided in GeoJSON format from http://data.wien.gv.at/daten/wfs?service=WFS&request=GetFeature&version=1.1.0&typeName=ogdwien:OEFFLINIENOGD&srsName=EPSG:4326&outputFormat=json:

  1. Just copy paste the url in Add Vector Layer | Protocol to load the dataset.
  2. Use DB Manager to load the layer into your database. (As you can see in the screenshot, I created a schema called network but that’s optional.)
  3. import_publictransport

  4. To make the line vector table routable, we use pgr_createTopology. This function assumes that columns called “source” and “target” exist, so we have to create those as well:
    alter table network.publictransport add column source integer;
    alter table network.publictransport add column target integer;
    select pgr_createTopology('network.publictransport', 0.0001, 'geom', 'id');
    

    I’m quite generously using a tolerance of 0.0001 degrees to build the topology. Depending on your dataset, you might want to be more strict here.

  5. Let’s test it! Route from source #1 to target #3000 using pgr_dijkstra:
    SELECT seq, id1 AS node, id2 AS edge, cost, geom
      FROM pgr_dijkstra(
        'SELECT id, source, target, st_length(geom) as cost FROM network.publictransport',
        1, 3000, false, false
      ) as di
      JOIN network.publictransport pt
      ON di.id2 = pt.id ;

    Note how the query joins the routing results and the network table together. (I’m aware that using the link length as a cost attribute will not lead to realistic results in a public transport network but bear with me for this example.)

  6. We can immediately see the routing results using the Load as new layer option:

select_dijkstra
route

Nice work! pgRouting 2.0 has come a long way. In a post from April this year, Boston GIS even announced plans to add pgRouting to Stack Builder. That’s going to make the installation even smoother.


WFS to PostGIS in 1 Step

This is an update to my previous post “WFS to PostGIS in 3 Steps”. Thanks to Even Rouault’s comments and improvements to GDAL, it is now possible to load Latin1-encoded WFS (like the one by data.wien.gv.at) into PostGIS in just one simple step.

To use the following instructions, you’ll have to get the latest GDAL (release-1600-gdal-mapserver.zip)

You only need to run SDKShell.bat to set up the environment and ogr2ogr is ready for action:

C:\Users\Anita>cd C:\release-1600-gdal-mapserver
C:\release-1600-gdal-mapserver>SDKShell.bat
C:\release-1600-gdal-mapserver>ogr2ogr -overwrite -f PostgreSQL PG:"user=myuser password=mypassword dbname=wien_ogd" "WFS:http://data.wien.gv.at/daten/geoserver/ows?service=WFS&request=GetFeature&version=1.1.0&typeName=ogdwien:BEZIRKSGRENZEOGD&srsName=EPSG:4326"

Thanks everyone for your comments and help!


WFS to PostGIS in 3 Steps

This is a quick note on how to download features from a WFS and import them into a PostGIS database. The first line downloads a zipped Shapefile from the WFS. The second one unzips it and the last one loads the data into my existing “gis_experimental” database:

wget "http://data.wien.gv.at/daten/geoserver/ows?service=WFS&request=GetFeature&version=1.1.0&typeName=ogdwien:BEZIRKSGRENZEOGD&srsName=EPSG:4326&outputFormat=shape-zip" -O BEZIRKSGRENZEOGD.zip
unzip -d /tmp BEZIRKSGRENZEOGD.zip
shp2pgsql -s 4326 -I -S -c -W Latin1 "/tmp/BEZIRKSGRENZEOGD.shp" | psql gis_experimental

Now, I’d just need a loop through the WFS Capabilities to automatically fetch all offered layers … Ideas anyone?

Thanks to Tim for his post “Batch importing shapefiles into PostGIS” which was very useful here.

Update: Many readers have pointed out that ogr2ogr is a great tool for this kind of use cases and can do the above in one line. That’s true – if it works. Unfortunately, it is picky about the supported encodings, e.g. doesn’t want to parse ISO-8859-15. In such cases, the three code lines above can be a good alternative.


Autumn 2012 course programme

Sourcepole offers introductory and advanced courses on running spatial data infrastructures based on PostgreSQL/PostGIS and Quantum GIS. Detailed information on the courses taking place in autumn 2012 can be found in the course programme. Registration is now open online. We look forward to welcoming you in Zurich.

Installing PostGIS 2.0 on ubuntu

PostGIS 2.0 is out and the awesomeness continues! You can install PostGIS 2.0 on Ubuntu using packages, which is exactly what I am going to show you here. Read on for details on how to get up and running and do your first simple raster analysis!

Note: You should make good backups first!

Before we begin, you should uninstall your existing postgis packages:

sudo dpkg --purge postgis postgresql-9.1-postgis

Then add a new repository and install PostGIS from there (based on this post):

sudo apt-add-repository ppa:sharpie/for-science  # To get GEOS 3.3.2
sudo apt-add-repository ppa:sharpie/postgis-nightly
sudo apt-get update
sudo apt-get install postgresql-9.1-postgis

Next we should create a new template database (optional but recommended).

createdb -E UTF8 template_postgis2
createlang -d template_postgis2 plpgsql
psql -d postgres -c "UPDATE pg_database SET datistemplate='true' WHERE datname='template_postgis2'"
psql -d template_postgis2 -f /usr/share/postgresql/9.1/contrib/postgis-2.0/postgis.sql
psql -d template_postgis2 -f /usr/share/postgresql/9.1/contrib/postgis-2.0/spatial_ref_sys.sql
psql -d template_postgis2 -f /usr/share/postgresql/9.1/contrib/postgis-2.0/rtpostgis.sql
psql -d template_postgis2 -c "GRANT ALL ON geometry_columns TO PUBLIC;"
psql -d template_postgis2 -c "GRANT ALL ON geography_columns TO PUBLIC;"
psql -d template_postgis2 -c "GRANT ALL ON spatial_ref_sys TO PUBLIC;"
createdb training -T template_postgis2

Ok now we can load a raster (see sample data download below):

raster2pgsql -s 4326 srtm_4326.tif | psql training
shp2pgsql -s 4326 -d -g geom -I places.shp places| psql training

Good - now our spatial database is ready to use - and has raster support! Here is a nice example of what you can do. The query looks up the altitude from the SRTM raster for each place listed using the ST_Value function:

select ST_Value(rast, geom, true) from places, srtm_4326;

It should produce something like this:

Doing a 'point on raster' query on our raster in QGIS

Further reading: A really good place to start is the Boston GIS cheatsheets - I am really looking forward to exploring all the new features that are available in PostGIS 2.0, thanks to all those involved in building it!

Sample data for the example listed
