Related Plugins and Tags

QGIS Planet

OSM data quality assessment: producing map to illustrate data quality

At Oslandia, we like working with Open Source tools and handling Open (geospatial) Data. In this article series, we play with OpenStreetMap (OSM) and its underlying data. Here comes the eighth article of this series, dedicated to OSM data quality evaluation through the production of new maps.

1 Description of OSM element

 1.1 Element metadata extraction

As mentioned in a previous article dedicated to metadata extraction, we have to focus on the element metadata itself if we want to produce valuable information about quality. The first questions to answer here are straightforward: what is an OSM element, and how do we extract its associated metadata? This part is quite similar to the job already done with users.

We know from previous analysis that an element is created within a changeset by a given contributor, may be modified several times by anyone, and may be deleted as well. Such an object is either a “node”, a “way” or a “relation”. We also know that a set of different tags may be associated with the element. Of course, the list of all operations associated with each element is recorded in the OSM data history. Let’s consider data around Bordeaux, as in previous blog posts:

import pandas as pd
elements = pd.read_table('../src/data/output-extracts/bordeaux-metropole/bordeaux-metropole-elements.csv', parse_dates=['ts'], index_col=0, sep=",")
elements.head().T
   elem        id  version  visible         ts    uid  chgset
0  node  21457126        2    False 2008-01-17  24281  653744
1  node  21457126        3    False 2008-01-17  24281  653744
2  node  21457126        4    False 2008-01-17  24281  653744
3  node  21457126        5    False 2008-01-17  24281  653744
4  node  21457126        6    False 2008-01-17  24281  653744

This short description helps us to identify some basic features, which are built in the following snippets. First we recover the temporal features:

elem_md = (elements.groupby(['elem', 'id'])['ts']
            .agg(["min", "max"])
            .reset_index())
elem_md.columns = ['elem', 'id', 'first_at', 'last_at']
elem_md['lifespan'] = (elem_md.last_at - elem_md.first_at)/pd.Timedelta('1D')
extraction_date = elements.ts.max()
elem_md['n_days_since_creation'] = ((extraction_date - elem_md.first_at)
                                  / pd.Timedelta('1d'))
elem_md['n_days_of_activity'] = (elements
                              .groupby(['elem', 'id'])['ts']
                              .nunique()
                              .reset_index())['ts']
elem_md = elem_md.sort_values(by=['first_at'])
elem_md.sample().T
                                    213418
elem                                  node
id                               922827508
first_at               2010-09-23 00:00:00
last_at                2010-09-23 00:00:00
lifespan                                 0
n_days_since_creation                 2341
n_days_of_activity                       1

Then we build the remaining variables, e.g. the number of versions, contributors and changesets per element:

elem_md['version'] = (elements.groupby(['elem','id'])['version']
                      .max()
                      .reset_index())['version']
elem_md['n_chgset'] = (elements.groupby(['elem', 'id'])['chgset']
                       .nunique()
                       .reset_index())['chgset']
elem_md['n_user'] = (elements.groupby(['elem', 'id'])['uid']
                     .nunique()
                     .reset_index())['uid']
osmelem_last_user = (elements
                     .groupby(['elem','id'])['uid']
                     .last()
                     .reset_index())
osmelem_last_user = osmelem_last_user.rename(columns={'uid':'last_uid'})
elements = pd.merge(elements, osmelem_last_user,
                    on=['elem', 'id'])
elem_md = pd.merge(elem_md,
                   elements[['elem', 'id', 'version', 'visible', 'last_uid']],
                   on=['elem', 'id', 'version'])
elem_md = elem_md.set_index(['elem', 'id'])
elem_md.sample().T
elem                                  node
id                              1340445266
first_at               2011-06-26 00:00:00
last_at                2011-06-27 00:00:00
lifespan                                 1
n_days_since_creation                 2065
n_days_of_activity                       2
version                                  2
n_chgset                                 2
n_user                                   1
visible                              False
last_uid                            354363

As an illustration, the sample above shows an old node with two versions that is no longer visible on the OSM website.

1.2 Characterize OSM elements with user classification

This set of features is only descriptive; we have to add more information to be able to characterize OSM data quality. That is the moment to exploit the user classification produced in the last blog post!

As a reminder, we hypothesized that clustering the users makes it possible to evaluate their trustworthiness as OSM contributors. According to the previous classification, they are either beginners, intermediate users, or even OSM experts.

Each OSM entity may have received one or more contributions from users of each group. Let’s say the entity quality is good if its last contributor is experienced. That leads us to classify the OSM entities themselves in return!

How to include this information into element metadata?

We first need to recover the results of our clustering process.

user_groups = pd.read_hdf("../src/data/output-extracts/bordeaux-metropole/bordeaux-metropole-user-kmeans.h5", "/individuals")
user_groups.head()
           PC1       PC2       PC3       PC4       PC5       PC6  Xclust
uid                                                                     
1626 -0.035154  1.607427  0.399929 -0.808851 -0.152308 -0.753506       2
1399 -0.295486 -0.743364  0.149797 -1.252119  0.128276 -0.292328       0
2488  0.003268  1.073443  0.738236 -0.534716 -0.489454 -0.333533       2
5657 -0.889706  0.986024  0.442302 -1.046582 -0.118883 -0.408223       4
3980 -0.115455 -0.373598  0.906908  0.252670  0.207824 -0.575960       5

As a remark, there were several important results to save after the clustering process; we decided to serialize them into a single binary file. Pandas knows how to manage such files, so it would be a pity not to take advantage of that!
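As an aside, here is a minimal sketch (with hypothetical data) of how several dataframes can be stored into such a file with pandas, one key per result:

# Minimal sketch with hypothetical data: several results can be serialized
# into a single HDF5 file under different keys, and read back individually.
import pandas as pd

df = pd.DataFrame({"PC1": [0.1, -0.3], "Xclust": [2, 0]}, index=[1626, 1399])
path = "/tmp/user-kmeans-example.h5"
df.to_hdf(path, "/individuals")                 # one key per result to save
restored = pd.read_hdf(path, "/individuals")    # read it back, as done above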

We recover the individual groups from the corresponding key of the binary file (column Xclust), and only have to join them to the element metadata as follows:

elem_md = elem_md.join(user_groups.Xclust, on='last_uid')
elem_md = elem_md.rename(columns={'Xclust':'last_uid_group'})
elem_md.reset_index().to_csv("../src/data/output-extracts/bordeaux-metropole/bordeaux-metropole-element-metadata.csv")
elem_md.sample().T
elem                                  node
id                              1530907753
first_at               2011-12-04 00:00:00
last_at                2011-12-04 00:00:00
lifespan                                 0
n_days_since_creation                 1904
n_days_of_activity                       1
version                                  1
n_chgset                                 1
n_user                                   1
visible                               True
last_uid                             37548
last_uid_group                           2

From now on, we can use the last contributor’s cluster as additional information to generate maps, so as to study data quality…

Wait… Isn’t there a piece of information missing? Well yes, maybe the most important one when dealing with geospatial data: the location itself!

1.3 Recover the geometry information

Even if the Pyosmium library is able to retrieve OSM element geometries, we ran some tests with another OSM data parser here: osm2pgsql.

We can recover geometries from standard OSM data with this tool, assuming the existence of an osm database owned by user:

osm2pgsql -E 27572 -d osm -U user -p bordeaux_metropole --hstore ../src/data/raw/bordeaux-metropole.osm.pbf

We specify a France-focused SRID (27572), and a prefix for naming the output tables: point, line, polygon and roads.
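As a quick sanity check, the tables created by osm2pgsql can be listed directly from Python; a sketch assuming a local osm database and the psycopg2 package (neither is part of the original workflow):

# Sketch: list the tables produced by osm2pgsql (hypothetical connection settings).
import psycopg2

conn = psycopg2.connect(dbname="osm", user="user")
cur = conn.cursor()
cur.execute("""SELECT table_name FROM information_schema.tables
               WHERE table_name LIKE 'bordeaux_metropole%'""")
print(cur.fetchall())   # expect the point, line, polygon and roads tables
cur.close()
conn.close()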

We can work with the line subset, which contains the physical roads among other structures (it roughly corresponds to OSM ways), and build an enriched version of the element metadata, with geometries.

First we can create the table bordeaux_metropole_geomelements, that will contain our metadata…

DROP TABLE IF EXISTS bordeaux_metropole_elements;
DROP TABLE IF EXISTS bordeaux_metropole_geomelements;
CREATE TABLE bordeaux_metropole_elements(
       id int,
       elem varchar,
       osm_id bigint,
       first_at varchar,
       last_at varchar,
       lifespan float,
       n_days_since_creation float,
       n_days_of_activity float,
       version int,
       n_chgsets int,
       n_users int,
       visible boolean,
       last_uid int,
       last_user_group int
);

…then populate it with the appropriate .csv file…

COPY bordeaux_metropole_elements
FROM '/home/rde/data/osm-history/output-extracts/bordeaux-metropole/bordeaux-metropole-element-metadata.csv'
WITH(FORMAT CSV, HEADER, QUOTE '"');

…and finally merge the metadata with the data gathered with osm2pgsql, which contains the geometries.

SELECT l.osm_id, h.lifespan, h.n_days_since_creation,
h.version, h.visible, h.n_users, h.n_chgsets,
h.last_user_group, l.way AS geom
INTO bordeaux_metropole_geomelements
FROM bordeaux_metropole_elements as h
INNER JOIN bordeaux_metropole_line as l
ON h.osm_id = l.osm_id AND h.version = l.osm_version
WHERE l.highway IS NOT NULL AND h.elem = 'way'
ORDER BY l.osm_id;

Wow, this is wonderful, we have everything we need in order to produce new maps, so let’s do it!

2 Keep it visual, man!

Starting from the last developments and some hypotheses about element quality, we are able to produce some customized maps. If each OSM entity (e.g. each road) can be characterized, then we can draw quality maps by highlighting the most trustworthy entities, as well as those with which we have to stay cautious.

In this post we will continue to focus on roads within the Bordeaux area. The different maps will be produced with the help of QGIS.

2.1 First step: simple metadata plotting

As a first insight into OSM elements, we can plot each OSM way according to simple features like the number of contributing users, the number of versions, or the element’s anteriority.

Figure 1: Number of active contributors per OSM way in Bordeaux

 

Figure 2: Number of versions per OSM way in Bordeaux

With the first two maps, we see that the ring around Bordeaux is the most intensively modified part of the road network: more unique contributors are involved in the way completion, and more versions exist for each element. Some major roads within the city center present the same characteristics.

Figure 3: Anteriority of each OSM way in Bordeaux, in years

If we consider the anteriority of OSM roads, we get a different but interesting insight into the area. The oldest roads are mainly located within the city center, even if there are some exceptions. It is also interesting to notice that some spatial patterns arise with temporality: entire neighborhoods are mapped within the same period.

2.2 More complex: OSM data merging with alternative geospatial representations

To go deeper into the mapping analysis, we can use the INSEE gridded data, which divides France into 200-meter square tiles. As a corollary, OSM element statistics may be aggregated within each tile to produce additional maps. Unfortunately some information loss will occur, as such tiles are only defined where people live. However, it provides an interesting alternative illustration.

To exploit this new data set, we have to merge the previous table with the appropriate INSEE table. Creating indexes on both is of great interest before running such a merging operation:

CREATE INDEX insee_geom_gist
ON open_data.insee_200_carreau USING GIST(wkb_geometry);
CREATE INDEX osm_geom_gist
ON bordeaux_metropole_geomelements USING GIST(geom);

DROP TABLE IF EXISTS bordeaux_metropole_carroyed_ways;
CREATE TABLE bordeaux_metropole_carroyed_ways AS (
SELECT insee.ogc_fid, count(*) AS nb_ways,
avg(bm.version) AS avg_version, avg(bm.lifespan) AS avg_lifespan,
avg(bm.n_days_since_creation) AS avg_anteriority,
avg(bm.n_users) AS avg_n_users, avg(bm.n_chgsets) AS avg_n_chgsets,
insee.wkb_geometry AS geom
FROM open_data.insee_200_carreau AS insee
JOIN bordeaux_metropole_geomelements AS bm
ON ST_Intersects(insee.wkb_geometry, bm.geom)
GROUP BY insee.ogc_fid
);

As a consequence, we get only 5468 individuals (tiles), a quantity that must be compared to the 29427 roads previously handled… This aggregation will also simplify the map analysis!

We can propose another version of the previous maps by using QGIS; let’s consider the average number of contributors per OSM road, for each tile:

Figure 4: Number of contributors per OSM roads, aggregated by INSEE tile

2.3 The cherry on the cake: representation of OSM elements with respect to quality

Last but not least, the information about the last user’s cluster can shed some light on OSM data quality: by plotting each road according to the last user who has contributed, we might identify questionable OSM elements!

We simply have to design a similar map to the one in the previous section, with the user classification information:

Figure 5: OSM roads around Bordeaux, according to the last user cluster (1: C1, relation experts; 2: C0, versatile expert contributors; 3: C4, recent one-shot way contributors; 4: C3, old one-shot way contributors; 5: C5, locally-unexperienced way specialists)

According to the clustering done in the previous article (be careful, the legend is not the same here…), we can make some additional hypothesis:

  • Light-blue roads are OK: they correspond to the most trustworthy cluster of contributors (91.4% of roads in this example)
  • There is no group-0 road (group 0 corresponds to cluster C2 in the previous article)… And that’s comforting! It seems that “untrustworthy” users do not contribute to roads or -more probably- that their contributions are quickly amended.
  • Other contributions are made by intermediate users: a finer analysis should be undertaken to decide if the corresponding elements are valid. For now, we can consider everything is OK, even if local patterns seem strong. Areas of interest should be verified (they are not necessarily of low quality!)

For sure, it gives a fairly new picture of OSM data quality!

3 Conclusion

In this last article, we have designed new maps of a small area, starting from element metadata. You have seen the conclusion of our analysis: characterizing OSM data quality starting from the user contribution history.

Of course some work still has to be done, but we have detailed a whole methodology to tackle the problem. We hope you will be able to reproduce it, and to design your own maps!

Feel free to contact us if you are interested in this topic!

Speeding up your PyQGIS scripts

I’ve recently spent some time optimising the performance of various QGIS plugins and algorithms, and I’ve noticed that there’s a few common performance traps which developers fall into when fetching features from a vector layer. In this post I’m going to explore these traps, what makes them slow, and how to avoid them.

As a bit of background, features are fetched from a vector layer in QGIS using a QgsFeatureRequest object. Common use is something like this:

request = QgsFeatureRequest()
for feature in vector_layer.getFeatures(request):
    # do something

This code would iterate over all the features in the layer. Filtering the features is done by tweaking the QgsFeatureRequest, such as:

request = QgsFeatureRequest().setFilterFid(1001)
feature_1001 = next(vector_layer.getFeatures(request))

In this case calling getFeatures(request) just returns the single feature with an ID of 1001 (which is why we shortcut and use next(…) here instead of iterating over the results).

Now, here’s the trap: calling getFeatures is expensive. If you call it on a vector layer, QGIS will be required to set up a new connection to the data store (the layer provider), create a query to return data, and parse each result as it is returned from the provider. This can be slow, especially if you’re working with some type of remote layer, such as a PostGIS table over a VPN connection. This brings us to our first trap:

Trap #1: Minimise the calls to getFeatures()

A common task in PyQGIS code is to take a list of feature IDs and then request those features from the layer. I see a lot of older code which does this using something like:

for id in some_list_of_feature_ids:
    request = QgsFeatureRequest().setFilterFid(id)
    feature = next(vector_layer.getFeatures(request))
    # do something with the feature

Why is this a bad idea? Well, remember that every time you call getFeatures() QGIS needs to do a whole bunch of things before it can start giving you the matching features. In this case, the code is calling getFeatures() once for every feature ID in the list. So if the list had 100 features, that means QGIS is having to create a connection to the data source, set up and prepare a query to match a single feature, wait for the provider to process that, and then finally parse the single feature result. That’s a lot of wasted processing!

If the code is rewritten to take the call to getFeatures() outside of the loop, then the result is:

request = QgsFeatureRequest().setFilterFids(some_list_of_feature_ids)
for feature in vector_layer.getFeatures(request):
    # do something with the feature

Now there’s just a single call to getFeatures() here. QGIS optimises this request by using a single connection to the data source, preparing the query just once, and fetching the results in appropriately sized batches. The difference is huge, especially if you’re dealing with a large number of features.
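If you want to measure the difference on your own data, a rough timing sketch could look like this (run in the QGIS Python console, with vector_layer being any loaded layer):

# Rough timing sketch comparing the two approaches.
import time

ids = [f.id() for f in vector_layer.getFeatures()][:100]   # grab some feature IDs

start = time.time()
for fid in ids:
    feature = next(vector_layer.getFeatures(QgsFeatureRequest().setFilterFid(fid)))
print('one request per feature: %.2f s' % (time.time() - start))

start = time.time()
for feature in vector_layer.getFeatures(QgsFeatureRequest().setFilterFids(ids)):
    pass
print('single request: %.2f s' % (time.time() - start))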

Trap #2: Use QgsFeatureRequest filters appropriately

Here’s another common mistake I see in PyQGIS code. I often see this one when an author is trying to do something with all the selected features in a layer:

for feature in vector_layer.getFeatures():
    if not feature.id() in vector_layer.selectedFeaturesIds():
        continue

    # do something with the feature

What’s happening here is that the code is iterating over all the features in the layer, and then skipping any which aren’t in the list of selected features. See the problem? This code iterates over EVERY feature in the layer. If your layer has 10 million features, we are fetching every one of them from the data source, going through all the work of parsing it into a QGIS feature, and then promptly discarding it if it’s not in our list of selected features. It’s very inefficient, especially if fetching features is slow (such as when connecting to a remote database source).

Instead, this code should use the setFilterFids() method of QgsFeatureRequest:

request = QgsFeatureRequest().setFilterFids(vector_layer.selectedFeaturesIds())
for feature in vector_layer.getFeatures(request):
    # do something with the feature

Now, QGIS will only fetch features from the provider with matching feature IDs from the list. Instead of fetching and processing every feature in the layer, only the actual selected features will be fetched. It’s not uncommon to see operations which previously took many minutes (or hours!) drop down to a few seconds after applying this fix.

Another variant of this trap uses expressions to test the returned features:

filter_expression = QgsExpression('my_field > 20')
for feature in vector_layer.getFeatures():
    if not filter_expression.evaluate(feature):
        continue

    # do something with the feature

Again, this code is fetching every single feature from the layer and then discarding it if it doesn’t match the “my_field > 20” filter expression. By rewriting this to:

request = QgsFeatureRequest().setFilterExpression('my_field > 20')
for feature in vector_layer.getFeatures(request):
    # do something with the feature

we hand over the bulk of the filtering to the data source itself. Recent QGIS versions intelligently translate the filter into a format which can be applied directly at the provider, meaning that any relevant indexes and other optimisations can be applied by the provider itself. In this case the rewritten code means that ONLY the features matching the ‘my_field > 20’ criteria are fetched from the provider – there’s no time wasted messing around with features we don’t need.

 

Trap #3: Only request values you need

The last trap I often see is that more values are requested from the layer than are actually required. Let’s take this code:

my_sum = 0
for feature in vector_layer.getFeatures():
    my_sum += feature['value']

In this case there’s no way we can optimise the filters applied, since we need to process every feature in the layer. But this code is still inefficient. By default QGIS will fetch all the details for a feature from the provider. This includes all attribute values and the feature’s geometry. That’s a lot of processing – QGIS needs to transform the values from their original format into a format usable by QGIS, and the feature’s geometry needs to be parsed from its original type and rebuilt as a QgsGeometry object. In our sample code above we aren’t doing anything with the geometry, and we are only using a single attribute from the layer. By calling setFlags( QgsFeatureRequest.NoGeometry ) and setSubsetOfAttributes() we can tell QGIS that we don’t need the geometry, and that we only require a single attribute’s value:

my_sum = 0
request = QgsFeatureRequest().setFlags(QgsFeatureRequest.NoGeometry).setSubsetOfAttributes(['value'], vector_layer.fields() )
for feature in vector_layer.getFeatures(request):
    my_sum += feature['value']

None of the unnecessary geometry parsing will occur, and only the ‘value’ attribute will be fetched and populated in the features. This cuts down both on the processing required AND the amount of data transfer between the layer’s provider and QGIS. It’s a significant improvement if you’re dealing with larger layers.
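As a final note, the three tips combine naturally into a single request. For instance, summing ‘value’ only for features matching a filter expression might look like this (a sketch with placeholder field names):

# Sketch combining the three tips: provider-side filter, no geometry,
# and a restricted set of attributes ('my_field' and 'value' are placeholders).
request = (QgsFeatureRequest()
           .setFilterExpression('my_field > 20')
           .setFlags(QgsFeatureRequest.NoGeometry)
           .setSubsetOfAttributes(['my_field', 'value'], vector_layer.fields()))

my_sum = 0
for feature in vector_layer.getFeatures(request):
    my_sum += feature['value']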

Conclusion

Optimising your feature requests is one of the easiest ways to speed up your PyQGIS script! It’s worth spending some time looking over all your uses of getFeatures() to see whether you can cut down on what you’re requesting – the results can often be mind blowing!

The road to QGIS 3.0 – part 1

As we discussed in QGIS 3 is under way, the QGIS project is working toward the next major version of the application, and these developments have a major impact on any custom scripts or plugins you’ve developed for QGIS.

We’re now just over a week into this work, and already there have been tons of API-breaking changes landing in the code base. In this post we’ll explore some of these changes, what has motivated them, and what they mean for your scripts.

The best source for keeping track of these breaking changes is to watch the API break documentation on GitHub. This file is updated whenever a change lands which potentially breaks plugins/scripts, and will eventually become a low-level guide to porting plugins to QGIS 3.0.

API clean-ups

So far, lots of the changes which have landed have related to cleaning up the existing API. These include:

Removal of deprecated API calls

The API has been frozen since QGIS 2.0 was released in 2013, and in the years since then many things have changed. As a result, different parts of the API were deprecated along the way as newer, better ways of doing things were introduced. The deprecated code was left intact so that QGIS 2.x plugins would still all function correctly. Removing these older, deprecated code paths enables the QGIS developers to streamline the code, remove hacky workarounds and untested methods, and just generally “clean things up”. As an example, the older labelling system which pre-dates QGIS 2.0 (it had no collision detection, no curved labels, no fancy data-defined properties or rule-based labelling!) was still floating around just in case someone tried to open a QGIS 1.8 project. That’s all gone now, culling over 5000 lines of outdated, unmaintained code. Chances are this won’t affect your plugins in the slightest. Other removals, like the removal of QgsMapRenderer (the renderer used before multi-threaded rendering was introduced), likely have a much larger impact, as many scripts and plugins were still using QgsMapRenderer classes and calls. These all need to be migrated to the new QgsMapRendererJob and QgsMapSettings classes.
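As a rough illustration of the direction to take, rendering a map image with the newer classes looks something like this (a sketch only; check the API documentation for exact signatures, and note that in 2.x setLayers() expects layer IDs rather than layer objects):

# Sketch of the QgsMapSettings / renderer-job approach.
# 'layers' is a placeholder for a list of loaded map layers.
from qgis.core import QgsMapSettings, QgsMapRendererParallelJob
from qgis.PyQt.QtCore import QSize

settings = QgsMapSettings()
settings.setLayers(layers)
settings.setExtent(layers[0].extent())
settings.setOutputSize(QSize(800, 600))

job = QgsMapRendererParallelJob(settings)
job.start()
job.waitForFinished()
job.renderedImage().save('/tmp/map.png')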

Renaming things for consistency

Consistent naming helps keep the API predictable and more user friendly. Lots of changes have landed so far to make the naming of classes and methods more consistent. These include things like:

  • Making sure names use consistent capitalization. E.g., there were previously methods named “writeXML” and “writeXml”. These have all been renamed to consistently use camel case, including for acronyms. (In case you’re wondering – this convention is used to follow the Qt library conventions).
  • Consistent use of terms. The API previously used a mix of “CRS” and “SRS” for similar purposes – it now consistently uses “CRS” for a coordinate reference system.
  • Removal of abbreviations. Lots of abbreviated words have been removed from the names, eg “destCrs” has become “destinationCrs”. The API wasn’t consistently using the same abbreviations (ie “dest”/”dst”/”destination”), so it was decided to remove all use of abbreviated words and replace them with the full word. This helps keep things predictable, and is also a bit friendlier for non-native English speakers.

The naming changes all need to be addressed to make existing scripts and plugins compatible with QGIS 3.0. It’s potentially quite a lot of work for plugin developers, but in the long term it will make the API easier to use.

Changes to return and argument types

There have also been lots of changes relating to the types of objects returned by functions, or the types of objects used as function arguments. Most of these involve changing the C++ types from pointers to references, or from references to copies. These changes are being made to strengthen the API and avoid potential crashes. In most cases they don’t have any effect on PyQGIS code, with some exceptions:

  • Don’t pass Python “None” objects as QgsCoordinateReferenceSystems or as QgsCoordinateTransforms. In QGIS 3.0 you must pass invalid QgsCoordinateReferenceSystem objects (“QgsCoordinateReferenceSystem()”) or invalid QgsCoordinateTransform (“QgsCoordinateTransform()”) objects instead.

Transparent caching of CRS creation

The existing QgsCRSCache class has been removed. This class was used to cache the expensive results of initializing a QgsCoordinateReferenceSystem object, so that creating the same CRS again could be done instantly, avoiding slow database lookups. In QGIS 3.0 this caching is now handled transparently, so there is no longer a need for a separate QgsCRSCache and it has been removed. If you were using QgsCRSCache in your PyQGIS code, it will need to be removed and replaced with the standard QgsCoordinateReferenceSystem constructors.
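In practice the replacement is just the plain constructor, which is now cheap to call repeatedly (a small sketch):

from qgis.core import QgsCoordinateReferenceSystem

# Any code that previously went through QgsCRSCache can simply construct the
# CRS directly; the expensive initialisation is cached behind the scenes.
crs = QgsCoordinateReferenceSystem('EPSG:4326')   # fast even when called many times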

This change has the benefit that many existing plugins which were not explicitly using QgsCRSCache will now gain the benefits of the faster caching mechanism – potentially this could dramatically speed up existing plugin algorithms.

In summary

The QGIS developers have been busy fixing, improving and cleaning up the PyQGIS API. We recognise that these changes result in significant work for plugin and script developers, so we’re committed to providing quality documentation for how to adapt your code for these changes, and we will also investigate the use of automated tools to help ease your code transition to QGIS 3.0. We aren’t making changes lightly, but instead are carefully refining the API to make it more predictable, streamlined and stable.

If you’d like assistance with (or to outsource) the transition of your existing QGIS scripts and plugins to QGIS 3.0, just contact us at North Road to discuss. Every day we’re directly involved in the changes moving to QGIS 3.0, so we’re ideally placed to make this transition painless for you!

QGIS 3 is underway – what does it mean for your plugins and scripts?

With the imminent release of QGIS 2.16, the development attention has now shifted to the next scheduled release – QGIS 3.0! If you haven’t been following the discussion surrounding this I’m going to try and summarise what exactly 3.0 means and how it will impact any scripts or plugins you’ve developed for QGIS.

QGIS 3.0 is the first major QGIS release since 2.0 was released way back in September 2013. Since that release so much has changed in QGIS… a quick glance over the release notes for 2.14 shows that even for this single point release there have been hundreds of changes. Despite this, for all 2.x releases the PyQGIS API has remained stable, and a plugin or script which was developed for use in QGIS 2.0 will still work in QGIS 2.16.

Version 3.0 will introduce the first PyQGIS API break since 2013. An API break like this is required to move QGIS to newer libraries such as Qt 5 and Python 3, and allows the development team the flexibility to tackle long-standing issues and limitations which cannot be fixed using the 2.x API. Unfortunately, the side effect of this API break is that the scripts and plugins which you use in QGIS 2.x will no longer work when QGIS 3.0 is released!

Numerous API-breaking changes have already started to flow into QGIS, and 2.16 isn’t even publicly available yet. The best way to track these changes is to keep an eye on the “API changes” documentation. This document describes all the incoming changes which affect PyQGIS code, and how best they should be addressed by plugin and script maintainers. Some changes are quite trivial and easy to update code for; others are more extreme (such as changes surrounding the move to PyQt5 and Python 3) and may require significant time to adapt for.

I’d encourage all plugin and script developers to keep watching the API break documentation, and subscribe to the developers list for additional information about required changes as they are introduced.

If you’re looking for assistance or to outsource adaptation of your plugins and scripts to QGIS 3.0 – the team at North Road are ideally placed to assist! Our team includes some of the most experienced QGIS developers who are directly involved with the development of QGIS 3.0, so you can be confident knowing that your code is in good hands. Just contact us to discuss your QGIS development requirements.

You can read more about QGIS 3.0 API changes in The road to QGIS 3.0 – part 1.

A new QGIS plugin allows dynamic filtering of values in forms

   

This plugin has been partially funded (50%) by ARPA Piemonte.

Description

This is a core-enhancement QGIS plugin that makes the implementation of complex dynamic filters in QGIS attribute forms an easy task. For example, this widget can be used to implement drill-down forms, where the values available in one field depend on the values of other fields.

Download

The plugin is available on the official QGIS Python Plugin Repository, and the source code is in the QGIS Form Value Relation plugin repository on GitHub.

Implementation

The new “Form Value Relation” widget is essentially a clone of the core “Value Relation” widget, with some important differences. When the widget is created:
  • all the unfiltered features of the related layer are loaded and cached
  • the form values of all the attributes are added to the context (see below)
  • the filtering against the expression happens every time the widget is refreshed
  • a signal is bound to the form changes: if the changed field is present in the filter expression, the features are filtered against the expression and the widget is refreshed

Using form values in the expression

A new expression function is available (in the “Custom” section):
CurrentFormValue('FIELD_NAME')
This function returns the current value of a field in the editor form.

Note

  1. This function can only be used inside forms, and it’s particularly useful when used together with the custom widget `Form Value Relation`.
  2. If the field does not exist, the function returns an empty string.

Visual guide

Download the example project. This is the new widget in action: when the field FK_PROV changes, the ISTAT values are filtered according to the filter expression.
The new widget in action

The new widget drill-down in action


Choosing the new widget

Configuring the widget

Configuring the expression to read FK_PROV value from the form

QGIS developer meeting in Nødebo

During the hackfest I’ve been working on the refactoring of the server component, aimed at wrapping the server into a class and creating Python bindings for the new classes. This work is now in the PR queue and brings a first working Python test for the server itself.

The server can now be invoked directly from python, like in the example below:

 

#!/usr/bin/env python
"""
Super simple QgsServer.
"""

from qgis.server import *
from BaseHTTPServer import *

class handler (BaseHTTPRequestHandler):

    server = QgsServer()

    def _doHeaders(self, response):
        l = response.pop(0)
        while l:
            h = l.split(':')
            self.send_header(h[0], ':'.join(h[1:]))
            self.log_message( "send_header %s - %s" % (h[0], ':'.join(h[1:])))
            l = response.pop(0)
        self.end_headers()

    def do_HEAD(self):
        self.send_response(200)
        response = str(handler.server.handleRequestGetHeaders(self.path[2:])).split('\n')
        self._doHeaders(response)

    def do_GET(self):
        response = str(handler.server.handleRequest(self.path[2:])).split('\n')
        i = 0
        self.send_response(200)
        self._doHeaders(response)
        self.wfile.write(('\n'.join(response[i:])).strip())

    def do_OPTIONS(s):
        handler.do_GET(s)

httpd = HTTPServer( ('', 8000), handler)

while True:
    httpd.handle_request()

The Python bindings capture the server output instead of printing it on the FCGI stdout, and allow passing the request parameters (QUERY_STRING) directly to the request handler as a string; this makes writing Python tests very easy.
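For instance, a quick interactive check might look like this (a sketch only; the return format of handleRequest changed in later QGIS releases):

# Minimal sketch, mirroring the handler above: pass the query string directly.
from qgis.server import QgsServer

server = QgsServer()
response = str(server.handleRequest('SERVICE=WMS&REQUEST=GetCapabilities'))
print(response[:200])   # HTTP headers followed by the XML body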

How to read a raster cell with Python QGIS and GDAL

QGIS and GDAL both have Python bindings, so you can use either library to read a value from a raster cell. Since QGIS uses the GDAL libraries under the hood, we can expect to read the exact same value with both systems.

 

Here is a short example of how to do it with the two different approaches. We assume that you are working inside the QGIS Python console and that the project has a raster file loaded, but with just a few modifications the example can also be run from a standard Python console.

The example raster layer is a DTM, 1000 cells wide and 2000 cells high; we want to read the value of the cell with coordinates x = 500 and y = 1000.

# First layer in QGIS project is a DTM 2 bands raster
from osgeo import gdal
# You need this to convert raw values readings from GDAL
import struct

# Read the cell with this raster coordinates
x = 500
y = 1000

# Get the map layer registry
reg = QgsMapLayerRegistry.instance()

# Get the first layer (the DTM raster)
qgis_layer = reg.mapLayers().values()[0]

# Open the raster with GDAL
gdal_layer = gdal.Open(qgis_layer.source())

"""
Fetches the coefficients for transforming between pixel/line (P,L) raster space, 
and projection coordinates (Xp,Yp) space.
    Xp = padfTransform[0] + P*padfTransform[1] + L*padfTransform[2];
    Yp = padfTransform[3] + P*padfTransform[4] + L*padfTransform[5];
In a north up image, padfTransform[1] is the pixel width, and padfTransform[5] 
is the pixel height. The upper left corner of the upper left pixel is 
at position (padfTransform[0],padfTransform[3]).
"""
gt = gdal_layer.GetGeoTransform()

# o:origin, r:rotation, s:size
xo, xs, xr, yo, yr, ys = gt

# Read band 1 at the middle of the raster ( x = 500, y = 1000)
band = gdal_layer.GetRasterBand(1)
gdal_value = struct.unpack('f', band.ReadRaster(x, y, 1, 1, buf_type=band.DataType))[0]

xcoo = xo + xs * x + xr * y
ycoo = yo + yr * x + ys * y

# Read the value with QGIS, we must pass the map coordinates
# and the exact extent = 1 cell size
qgis_value = qgis_layer.dataProvider().identify(QgsPoint(xcoo, ycoo), \
    QgsRaster.IdentifyFormatValue, \
    theExtent=QgsRectangle( xcoo , ycoo, xcoo + xs, ycoo + ys) )\
    .results()[1]

assert(gdal_value == qgis_value)

Python SIP C++ bindings tutorial

Since QGIS uses the Qt libraries, SIP is the natural choice for creating the bindings.

Here are some random notes about this journey into SIP and Python bindings; I hope you’ll find them useful!
We will create a sample C++ library, a simple C++ program to test it and, finally, the SIP configuration file and the Python module, plus a short program to test it.

Create the example library

First, we need a C++ library. Following the tutorial on the official SIP website, I created a simple library named hellosip:

 

$ mkdir hellosip
$ cd hellosip
$ touch hellosip.h hellosip.cpp Makefile.lib

This is the content of the header file hellosip.h:

#include <string>

using namespace std;

class HelloSip {
    const string the_word;
public:
    // ctor
    HelloSip(const string w);
    string reverse() const;
};

This is the implementation in the file hellosip.cpp; the library just reverses a string, nothing really useful.

#include "hellosip.h"
#include <string>

HelloSip::HelloSip(const string w): the_word(w)
{
}

string HelloSip::reverse() const
{
    string tmp;
    for (string::const_reverse_iterator rit=the_word.rbegin(); rit!=the_word.rend(); ++rit)
        tmp += *rit;
    return tmp;
}

 

Compiling and linking the shared library

Now it’s time to compile the library. g++ must be invoked with the -fPIC option in order to generate Position Independent Code; -g tells the compiler to generate debug symbols and is not strictly necessary if you don’t need to debug the library:

g++ -c -g -fPIC hellosip.cpp -o hellosip.o

The linker needs a few options to create a dynamically linked Shared Object (.so) library: first -shared, which tells gcc to create a shared library; then -soname, which sets the library version name; and last -export-dynamic, which is also not strictly necessary but can be useful for debugging in case the library is dynamically opened (with dlopen):

g++ -shared -Wl,-soname,libhellosip.so.1  -g -export-dynamic -o libhellosip.so.1  hellosip.o

At the end of this process, we should have a brand new libhellosip.so.1 sitting in the current directory.

For more information on shared libraries under Linux you can read the TLDP chapter on this topic.

 

Using the library with C++

Before starting the binding creation with SIP, we want to test the new library with a simple C++ program stored in a new cpp file: hellosiptest.cpp:

#include "hellosip.h"
#include <string>
using namespace std;
// Prints True if the string is correctly reversed
int main(int argc, char* argv[]) {
  HelloSip hs("ciao");
  cout << ("oaic" == hs.reverse() ? "True" : "False") << endl;
  return 0;
}

To compile the program we use the simple command:

g++ hellosiptest.cpp -g -L.  -lhellosip -o hellosiptest

which fails with the following error:

/usr/bin/ld: cannot find -lhellosip
collect2: error: ld returned 1 exit status

For this tutorial we are skipping the installation part, which would have created the proper links from the base soname; we create the link now with:

ln -s libhellosip.so.1 libhellosip.so

The compiler should now be happy and produce a hellosiptest executable, which can be tested with:

$ ./hellosiptest
True

When launching the program, we might instead see a new error:

./hellosiptest: error while loading shared libraries: libhellosip.so.1: cannot open shared object file: No such file or directory

This is due to the fact that we have not installed our test library system-wide, so the operating system is not able to locate and dynamically load it. We can fix this in the current shell by adding the current path to the LD_LIBRARY_PATH environment variable, which tells the operating system which directories have to be searched for shared libraries. The following command will do just that:

export LD_LIBRARY_PATH=`pwd`

Note that this environment variable setting is “temporary” and will be lost when you exit the current shell.

 

 

SIP bindings

Now that we know that the library works, we can start with the bindings. SIP needs an interface header file with the instructions to create the bindings; its syntax resembles that of a standard C header file with the addition of a few directives, and it contains (among other bits) the name of the module and the classes and methods to export.

The SIP header file hellosip.sip contains two blocks of instructions: the class definition, which ends around line 15, and an additional %MappedType block that specifies how the std::string type can be translated from/to Python objects; this block is only necessary because we use a standard C++ type. You will notice that the class definition part is quite similar to the C++ header file hellosip.h:

// Define the SIP wrapper to the hellosip library.

%Module hellosip

class HelloSip {

%TypeHeaderCode
#include <hellosip.h>
%End

public:
    HelloSip(const std::string w);
    std::string reverse() const;
};

// Creates the mapping for std::string
// From: http://www.riverbankcomputing.com/pipermail/pyqt/2009-July/023533.html

%MappedType std::string
{
%TypeHeaderCode
#include <string>
%End

%ConvertFromTypeCode
    // convert an std::string to a Python (unicode) string
    PyObject* newstring;
    newstring = PyUnicode_DecodeUTF8(sipCpp->c_str(), sipCpp->length(), NULL);
    if(newstring == NULL) {
        PyErr_Clear();
        newstring = PyString_FromString(sipCpp->c_str());
    }
    return newstring;
%End

%ConvertToTypeCode
    // Allow a Python string (or a unicode string) whenever a string is
    // expected.
    // If argument is a Unicode string, just decode it to UTF-8
    // If argument is a Python string, assume it's UTF-8
    if (sipIsErr == NULL)
        return (PyString_Check(sipPy) || PyUnicode_Check(sipPy));
    if (sipPy == Py_None) {
        *sipCppPtr = new std::string;
        return 1;
    }
    if (PyUnicode_Check(sipPy)) {
        PyObject* s = PyUnicode_AsEncodedString(sipPy, "UTF-8", "");
        *sipCppPtr = new std::string(PyString_AS_STRING(s));
        Py_DECREF(s);
        return 1;
    }
    if (PyString_Check(sipPy)) {
        *sipCppPtr = new std::string(PyString_AS_STRING(sipPy));
        return 1;
    }
    return 0;
%End
};

At this point we could run the sip command by hand, but the documentation suggests using the Python module sipconfig which, given a few configuration variables, automatically creates the Makefile for us. The file is by convention named configure.py:

import os
import sipconfig

basename = "hellosip"

# The name of the SIP build file generated by SIP and used by the build
# system.
build_file = basename + ".sbf"

# Get the SIP configuration information.
config = sipconfig.Configuration()

# Run SIP to generate the code.
os.system(" ".join([config.sip_bin, "-c", ".", "-b", build_file, basename + ".sip"]))

# Create the Makefile.
makefile = sipconfig.SIPModuleMakefile(config, build_file)

# Add the library we are wrapping.  The name doesn't include any platform
# specific prefixes or extensions (e.g. the "lib" prefix on UNIX, or the
# ".dll" extension on Windows).
makefile.extra_libs = [basename]

# Search libraries in current directory
makefile.extra_lflags= ['-L.']

# Generate the Makefile itself.
makefile.generate()

We now have a Makefile ready to build the bindings: just run make to build the library. If everything goes right you will find a new hellosip.so library, which is the Python module. To test it, we can use the following simple program (always make sure that LD_LIBRARY_PATH contains the directory where libhellosip.so is found).

import hellosip
print hellosip.HelloSip('ciao').reverse() == 'oaic'

Download

The full source code of this tutorial can be downloaded from this link.

QGIS server python plugins tutorial

This is the second article about python plugins for QGIS server, see also the introductory article posted a few days ago.

In this post I will introduce the helloServer example plugin that shows some common implementation patterns exploiting the new QGIS Server Python Bindings API.

Server plugins and desktop interfaces

Server plugins can optionally have a desktop interface exactly like all standard QGIS plugins.

A typical use case for a server plugin that also has a desktop interface is to allow users to configure the server side of the plugin from QGIS desktop; this is the same principle as configuring the WMS/WFS services of QGIS server from the project properties.

The only important difference is that while the WMS/WFS services configuration is stored in the project file itself, plugins can store and access project data but not the user’s settings (because the server process normally runs as a different user). For this reason, if you want to share configuration settings between the server and the desktop, and you normally run the server as a different user, paths and permissions have to be carefully configured to grant both users access to the shared data.

 

Server configuration

This is an example configuration for Apache; it covers both FCGI and CGI:

  ServerAdmin webmaster@localhost
  # Add an entry to your /etc/hosts file for xxx localhost e.g.
  # 127.0.0.1 xxx
  ServerName xxx
    # Longer timeout for WPS... default = 40
    FcgidIOTimeout 120 
    FcgidInitialEnv LC_ALL "en_US.UTF-8"
    FcgidInitialEnv PYTHONIOENCODING UTF-8
    FcgidInitialEnv LANG "en_US.UTF-8"
    FcgidInitialEnv QGIS_DEBUG 1
    FcgidInitialEnv QGIS_CUSTOM_CONFIG_PATH "/home/xxx/.qgis2/"
    FcgidInitialEnv QGIS_SERVER_LOG_FILE /tmp/qgis.log
    FcgidInitialEnv QGIS_SERVER_LOG_LEVEL 0
    FcgidInitialEnv QGIS_OPTIONS_PATH "/home/xxx/public_html/cgi-bin/"
    FcgidInitialEnv QGIS_PLUGINPATH "/home/xxx/.qgis2/python/plugins"
    FcgidInitialEnv LD_LIBRARY_PATH "/home/xxx/apps/lib"

    # For simple CGI: ignored by fcgid
    SetEnv QGIS_DEBUG 1
    SetEnv QGIS_CUSTOM_CONFIG_PATH "/home/xxx/.qgis2/"
    SetEnv QGIS_SERVER_LOG_FILE /tmp/qgis.log 
    SetEnv QGIS_SERVER_LOG_LEVEL 0
    SetEnv QGIS_OPTIONS_PATH "/home/xxx/public_html/cgi-bin/"
    SetEnv QGIS_PLUGINPATH "/home/xxx/.qgis2/python/plugins"
    SetEnv LD_LIBRARY_PATH "/home/xxx/apps/lib"

    RewriteEngine On
    
        RewriteCond %{HTTP:Authorization} .
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
    

  ScriptAlias /cgi-bin/ /home/xxx/apps/bin/
  <Directory "/home/xxx/apps/bin/">
    AllowOverride All
    Options +ExecCGI -MultiViews +FollowSymLinks
    Require all granted  

  ErrorLog ${APACHE_LOG_DIR}/xxx-error.log
  CustomLog ${APACHE_LOG_DIR}/xxx-access.log combined


In this particular example, I’m using a QGIS server built from sources and installed in /home/xxx/apps/bin; the libraries are in /home/xxx/apps/lib and LD_LIBRARY_PATH points to this location.
QGIS_CUSTOM_CONFIG_PATH tells the server where to search for the QGIS configuration (for example qgis.db).
QGIS_PLUGINPATH is searched for plugins at startup: your server plugins must sit in this directory; while developing, you can use the same directory as your QGIS desktop installation.
QGIS_DEBUG is set to 1 to enable debugging and logging.

Anatomy of a server plugin

For a plugin to be seen as a server plugin, it must provide the correct metadata information and a factory method:

Plugin metadata

A server-enabled plugin must advertise itself as a server plugin by adding the line

server=True

in its metadata.txt file.

The serverClassFactory method

A server-enabled plugin is basically just a standard QGIS Python plugin that provides a serverClassFactory(serverIface) function in its __init__.py. This function is invoked once when the server starts to generate the plugin instance (it’s called on each request if running in CGI mode, which is not recommended) and returns a plugin instance:

def serverClassFactory(serverIface):
    from HelloServer import HelloServerServer
    return HelloServerServer(serverIface)

You’ll notice that this is the same pattern we have in “traditional” QGIS plugins.

Server Filters

A server plugin typically consists of one or more callbacks packed into objects called QgsServerFilter.

Each QgsServerFilter implements one or more of the following callbacks: requestReady(), sendResponse() and responseComplete().

The following example implements a minimal filter which prints HelloServer! in case the SERVICE parameter equals “HELLO”.

from qgis.server import *
from qgis.core import *

class HelloFilter(QgsServerFilter):

    def __init__(self, serverIface):
        super(HelloFilter, self).__init__(serverIface)    

    def responseComplete(self):        
        request = self.serverInterface().requestHandler()
        params = request.parameterMap()
        if params.get('SERVICE', '').upper() == 'HELLO':
            request.clearHeaders()
            request.setHeader('Content-type', 'text/plain')
            request.clearBody()
            request.appendBody('HelloServer!')

The filters must be registered into the serverIface as in the following example:

class HelloServerServer:
    def __init__(self, serverIface):
        # Save reference to the QGIS server interface
        self.serverIface = serverIface
        serverIface.registerFilter( HelloFilter(serverIface), 100 )

The second parameter of registerFilter allows setting a priority, which defines the order of the callbacks with the same name (the lowest priority is invoked first).
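Once the plugin is deployed, the HELLO service can be checked from the client side with a plain HTTP request; a quick sketch (the endpoint URL is hypothetical and depends on your server configuration):

# Quick client-side check of the HELLO service (hypothetical server URL).
import urllib2

url = 'http://localhost/cgi-bin/qgis_mapserv.fcgi?SERVICE=HELLO'
print urllib2.urlopen(url).read()   # expected output: HelloServer!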

Full control over the flow

By using the three callbacks, plugins can manipulate the input and/or the output of the server in many different ways. At any moment, the plugin instance has access to the QgsRequestHandler through the QgsServerInterface; the QgsRequestHandler has plenty of methods that can be used to alter the input parameters before entering the core processing of the server (by using requestReady) or after the request has been processed by the core services (by using sendResponse).

The following examples cover some common use cases:

Modifying the input

The example plugin contains a test that changes input parameters coming from the query string. In this example a new parameter is injected into the (already parsed) parameterMap; this parameter is then visible to core services (WMS etc.), and at the end of the core services’ processing we check that the parameter is still there.

from qgis.server import *
from qgis.core import *

class ParamsFilter(QgsServerFilter):

    def __init__(self, serverIface):
        super(ParamsFilter, self).__init__(serverIface)

    def requestReady(self):
        request = self.serverInterface().requestHandler()
        params = request.parameterMap( )
        request.setParameter('TEST_NEW_PARAM', 'ParamsFilter')

    def responseComplete(self):
        request = self.serverInterface().requestHandler()
        params = request.parameterMap( )
        if params.get('TEST_NEW_PARAM') == 'ParamsFilter':
            QgsMessageLog.logMessage("SUCCESS - ParamsFilter.responseComplete", 'plugin', QgsMessageLog.INFO)
        else:
            QgsMessageLog.logMessage("FAIL    - ParamsFilter.responseComplete", 'plugin', QgsMessageLog.CRITICAL)

This is an extract of what you see in the log file:

src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] HelloServerServer - loading filter ParamsFilter
src/core/qgsmessagelog.cpp: 45: (logMessage) [1ms] 2014-12-12T12:39:29 Server[0] Server plugin HelloServer loaded!
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 Server[0] Server python plugins loaded
src/mapserver/qgsgetrequesthandler.cpp: 35: (parseInput) [0ms] query string is: SERVICE=HELLO&request=GetOutput
src/mapserver/qgshttprequesthandler.cpp: 547: (requestStringToParameterMap) [1ms] inserting pair SERVICE // HELLO into the parameter map
src/mapserver/qgshttprequesthandler.cpp: 547: (requestStringToParameterMap) [0ms] inserting pair REQUEST // GetOutput into the parameter map
src/mapserver/qgsserverfilter.cpp: 42: (requestReady) [0ms] QgsServerFilter plugin default requestReady called
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] HelloFilter.requestReady
src/mapserver/qgis_map_serv.cpp: 235: (configPath) [0ms] Using default configuration file path: /home/xxx/apps/bin/admin.sld
src/mapserver/qgshttprequesthandler.cpp: 49: (setHttpResponse) [0ms] Checking byte array is ok to set...
src/mapserver/qgshttprequesthandler.cpp: 59: (setHttpResponse) [0ms] Byte array looks good, setting response...
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] HelloFilter.responseComplete
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] SUCCESS - ParamsFilter.responseComplete
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] RemoteConsoleFilter.responseComplete
src/mapserver/qgshttprequesthandler.cpp: 158: (sendResponse) [0ms] Sending HTTP response
src/core/qgsmessagelog.cpp: 45: (logMessage) [0ms] 2014-12-12T12:39:29 plugin[0] HelloFilter.sendResponse

On line 13 the “SUCCESS” string indicates that the plugin passed the test.

The same technique can be exploited to use a custom service instead of a core one: you could for example skip a WFS SERVICE request, or any other core request, just by changing the SERVICE parameter to something different, so that the core service is skipped. You can then inject your custom results into the output and send them to the client (this is explained below).
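A minimal sketch of this technique, using only the request handler methods shown above (the service name “MYSERVICE” and the JSON payload are made up):

from qgis.server import *
from qgis.core import *

class CustomServiceFilter(QgsServerFilter):
    """Sketch: intercept SERVICE=MYSERVICE, skip core services, return custom output."""

    def __init__(self, serverIface):
        super(CustomServiceFilter, self).__init__(serverIface)

    def requestReady(self):
        request = self.serverInterface().requestHandler()
        if request.parameterMap().get('SERVICE', '').upper() == 'MYSERVICE':
            # Rename the service so that no core service (WMS/WFS/...) handles it
            request.setParameter('SERVICE', 'SKIPPED_MYSERVICE')

    def responseComplete(self):
        request = self.serverInterface().requestHandler()
        if request.parameterMap().get('SERVICE', '').upper() == 'SKIPPED_MYSERVICE':
            request.clearHeaders()
            request.setHeader('Content-type', 'application/json')
            request.clearBody()
            request.appendBody('{"status": "ok"}')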

Changing or replacing the output

The watermark filter example shows how to replace the WMS output with a new image obtained by adding a watermark image on top of the WMS image generated by the WMS core service:

import os

from qgis.server import *
from qgis.core import *
from PyQt4.QtCore import *
from PyQt4.QtGui import *


class WatermarkFilter(QgsServerFilter):

    def __init__(self, serverIface):
        super(WatermarkFilter, self).__init__(serverIface)

    def responseComplete(self):
        request = self.serverInterface().requestHandler()
        params = request.parameterMap( )
        # Do some checks
        if (request.parameter('SERVICE').upper() == 'WMS' \
                and request.parameter('REQUEST').upper() == 'GETMAP' \
                and not request.exceptionRaised() ):
            QgsMessageLog.logMessage("WatermarkFilter.responseComplete: image ready %s" % request.infoFormat(), 'plugin', QgsMessageLog.INFO)
            # Get the image
            img = QImage()
            img.loadFromData(request.body())
            # Adds the watermark
            watermark = QImage(os.path.join(os.path.dirname(__file__), 'media/watermark.png'))
            p = QPainter(img)
            p.drawImage(QRect( 20, 20, 40, 40), watermark)
            p.end()
            ba = QByteArray()
            buffer = QBuffer(ba)
            buffer.open(QIODevice.WriteOnly)
            img.save(buffer, "PNG")
            # Set the body
            request.clearBody()
            request.appendBody(ba)

In this example the SERVICE parameter value is checked and, if the incoming request is a WMS GETMAP and no exceptions have been set by a previously executed plugin or by the core service (WMS in this case), the WMS-generated image is retrieved from the output buffer and the watermark image is added. The final step is to clear the output buffer and replace it with the newly generated image. Please note that in a real-world situation we should also check the requested image type instead of returning a PNG in any case.

The power of python

The examples above are just meant to explain how to interact with the QGIS Server Python bindings, but server plugins have full access to all QGIS Python bindings and to thousands of Python libraries; what you can do with Python server plugins is limited only by your imagination!

 

See all QGIS Server related posts

QGIS and IPython: the definitive interactive console

Whatever your level of Python knowledge, once you discover the advantages and super-powers of IPython you will never run the default Python console again, really: never!

If you’ve never heard about IPython, discover it on the official IPython website. Don’t be put off by its notebook, graphics and parallel computing capabilities: it is worth using even just as a substitute for the standard Python shell.

I discovered IPython more than 5 years ago and it literally changed my life: I also use it for debugging instead of pdb. You can embed an IPython console in your code with:

from IPython import embed; embed()

TAB completion with full introspection

What I like most in IPython is its TAB completion: it is not just text matching as you type, it does full realtime introspection, so you only see what you actually have access to, be it a method of an instance or a class, a property, a module, a submodule or whatever else you might think of. It even works while you are importing something or typing a path, as in open('/home/.....

Its TAB completion is so powerful that you can even use shell commands from within the IPython interpreter!

Full documentation is just a question mark away

Just type “?” after a method or function to print its docstring, or its signature in the case of SIP bindings.

Lot of special functions

IPython special functions are available for history, paste, run, include and much more; they are prefixed with “%” and self-documented in the shell.

All that sounds great! But what does it have to do with QGIS?

I personally find that the QGIS Python console lacks some important features, especially in its autocompletion (autosuggest). What is the purpose of autocompletion when most of the time you just get a traceback because the method the autocompleter proposed belongs to another class? My brain is too small and too old to keep the whole API docs in mind; autocompletion is useful when it is intelligent enough to tell apart the methods and properties of the instance or class you are actually operating on.

Another problem is that the API is very far from being “pythonic” (this isn’t anyone’s fault, it’s just how SIP works). Here’s an example (suppose we want the SRID of the first layer):

core.QgsMapLayerRegistry.instance().mapLayers().values()[0].crs().authid()
# TAB completion stops working here^

TAB completion stops working at the first parenthesis :(

What if all those getters were properties?

registry = core.QgsMapLayerRegistry.instance()
# With a couple of TABs without having to remember any method or function name!
registry.p_mapLayers.values()
[<qgis._core.QgsRasterLayer at 0x7f07dff8e2b0>,
 <qgis._core.QgsRasterLayer at 0x7f07dff8ef28>,
 <qgis._core.QgsVectorLayer at 0x7f07dff48c30>,
 <qgis._core.QgsVectorLayer at 0x7f07dff8e478>,
 <qgis._core.QgsVectorLayer at 0x7f07dff489d0>,
 <qgis._core.QgsVectorLayer at 0x7f07dff48770>]

layer = registry.p_mapLayers.values()[0]

layer.p_c ---> TAB!
layer.p_cacheImage            layer.p_children       layer.p_connect       
layer.p_capabilitiesString    layer.p_commitChanges  layer.p_crs           
layer.p_changeAttributeValue  layer.p_commitErrors   layer.p_customProperty

layer.p_crs.p_ ---> TAB!
layer.p_crs.p_authid               layer.p_crs.p_postgisSrid      
layer.p_crs.p_axisInverted         layer.p_crs.p_projectionAcronym
layer.p_crs.p_description          layer.p_crs.p_recentProjections
layer.p_crs.p_ellipsoidAcronym     layer.p_crs.p_srsid            
layer.p_crs.p_findMatchingProj     layer.p_crs.p_syncDb           
layer.p_crs.p_geographicCRSAuthId  layer.p_crs.p_toProj4          
layer.p_crs.p_geographicFlag       layer.p_crs.p_toWkt            
layer.p_crs.p_isValid              layer.p_crs.p_validationHint   
layer.p_crs.p_mapUnits    

layer.p_crs.p_authid
Out[]: u'EPSG:4326'

This works thanks to a quick and dirty hack called propertize, which adds a p_... property for every method in a module or class that:

  1. returns something
  2. does not take any argument (except self)

This leaves the original methods untouched (in case they were overloaded!), still allowing full introspection and TAB completion through a pythonic interface.
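
To make the idea concrete, here is a minimal sketch of what such a propertize helper could look like for plain Python classes (Python 2 era, matching the rest of this post). This is not the plugin's actual code, and real SIP-wrapped classes need extra care because their methods don't expose a normal argspec:

import inspect


def propertize(cls, prefix='p_'):
    """Add a read-only p_<name> property for every public method of cls
    that takes no argument besides self (a sketch of the idea only)."""
    for name in dir(cls):
        if name.startswith('_'):
            continue
        method = getattr(cls, name)
        if not callable(method):
            continue
        try:
            spec = inspect.getargspec(method)
        except TypeError:
            # builtins and SIP wrappers are not introspectable this way
            continue
        if len(spec.args) > 1 or spec.varargs or spec.keywords:
            continue
        # the original method is left untouched, only p_<name> is added
        setattr(cls, prefix + name,
                property(lambda self, _m=name: getattr(self, _m)()))
    return cls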

A few methods do not work with propertize yet; so far, singleton methods like instance() are not passing the unit tests.

IPyConsole: a QGIS IPython plugin

If you’ve been reading up to this point you probably can’t wait to start using IPython inside your beloved QGIS (if that’s not the case, please re-read the previous paragraphs carefully until your appetite grows!).

An experimental plugin that brings the magic of IPython to QGIS is now available:
Download IPyConsole

 

Please start exploring QGIS objects and classes and give me some feedback!

 

IPyConsole QGIS plugin

Installation notes

You basically need only a working IPython installation. IPython is available for all major platforms and distributions; please refer to the official documentation.

 

Accessing composer item properties via custom expressions in QGIS

So here is a neat trick. Let’s say you want to access the scale of a composer map to make it part of a label. The scale bar can already be set to numeric to show the number value, but what if it needs to be part of an existing label with other text? Not to fear, expression functions are here.

  • Create a new composer. Add the map frame and a label.
  • Set the item ID of the map frame to something you can remember; let’s just use themap
  • Select the label and add some text
  • Click Insert Expression

Now for the cool part

  • Select Function Editor
  • Click New File. Give the file a new name and hit save. I called it composer functions.

In the code editor paste this code:

from qgis.utils import iface
from qgis.core import *
from qgis.gui import *

@qgsfunction(args="auto", group='Composer')
def composeritemattr(composername, mapname, attrname, feature, parent):
    composers = iface.activeComposers()
    # Find the composer with the given name
    comp = [composer.composition() for composer in composers 
                if composer.composerWindow().windowTitle() == composername][0]
    # Find the item
    item = comp.getComposerItemById(mapname)
    # Get the attr by name and call 
    return getattr(item, attrname)()

  • Click Run Script

run

Now in your label use this text:

Scale: [% composeritemattr('Composer 1', 'themap', 'scale')%]

Update the Composer 1 to match your composer name, and the themap to match your item ID.

and like magic here is the scale from the map item in a label:

2015-05-21 22_00_09-Composer 1

Check the expression error section if the label doesn’t render

error



PSA: Please use new style Qt signals and slots not the old style

Don’t do this:

self.connect(self.widget, 
             SIGNAL("valueChanged(int)"), 
             self.valuechanged)

It’s the old way, the crappy way. It’s prone to errors and typing mistakes. And who really wants to be typing function signatures and argument types as strings anyway? Gross.

Do this:

self.widget.valueChanged.connect(self.valuechanged)
self.widget.valueChanged[str].connect(self.valuechanged)

Much nicer. Cleaner. It looks and feels like Python, not some mash-up between C++ and Python. The int argument is the default, so that overload will be used. If you need to pick a specific overload of the signal, you can use [type].

Don’t do this:

self.emit(SIGNAL("changed()"), value1, value2)

Do this

class MyType(QObject):
    changed = pyqtSignal(str, int)

    def stuff(self):
        self.changed.emit(value1, value2)

pyqtSignal is a type you can use to define your signals. It comes with type checking; if you don't want type checking, just use pyqtSignal(object).
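
For example (a minimal illustration, class and payload made up), a signal declared with object skips the type conversion and checking entirely:

from PyQt4.QtCore import QObject, pyqtSignal


class MyLooseType(QObject):
    # object means "any single Python object", so no type checking is performed
    changed = pyqtSignal(object)

    def stuff(self):
        self.changed.emit({'anything': 42})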

Please think of the poor kittens before using the old style in your code.



An interactive command bar for QGIS

Something that has been on my mind for a long time is an interactive command interface for QGIS: something that you can easily open, run simple commands in, and that is interactive enough to ask for arguments when they are needed.

After using the command interface in Emacs for a little bit over the weekend – you can almost hear the Boos! from heavy Vim users :) – I thought this is something I must have in QGIS as well.  I’m sure it can’t be that hard to add.

So here it is: an interactive command interface for QGIS.

commandbar

commandbar2

The command bar plugin (find it in the plugin installer) adds a simple interactive command bar to QGIS. Commands are defined as Python code and may take arguments.

Here is an example function:

@command.command("Name")
def load_project(name):
    """
    Load a project from the set project paths
    """
    _name = name
    name += ".qgs"
    for path in project_paths:
        for root, dirs, files in os.walk(path):
            if name in files:
                path = os.path.join(root, name)
                iface.addProject(path)
                return
    iface.addProject(_name)

All functions are interactive: if not all arguments are given when a function is called, the command bar will prompt for each missing one.

Here is an example of calling the point-at function with no args. It will ask for the x and then the y

pointat

Here is calling point-at with all the args

pointatfunc

Functions can be called in the command bar like so:

my-function arg1 arg2 arg2

The command bar splits the line on spaces: the first token is always the function name, and the rest are arguments passed to the function. You will also note that it converts _ to -, which is easier to type and looks nicer.
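
A hedged sketch of that splitting and name mangling (this is just the idea, not the plugin's actual parser):

def parse_command(line):
    """Split a command-bar line: the first token is the function name
    (with '-' mapped back to '_'), the rest are string arguments."""
    tokens = line.strip().split()
    name = tokens[0].replace('-', '_')
    return name, tokens[1:]

# parse_command("point-at 100 200") -> ('point_at', ['100', '200'])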

The command bar also has auto complete for defined functions – and tooltips once I get that to work correctly.

You can use CTRL + ; (CTRL + Semicolon) or CTRL + , (CTRL + Comma) to open and close the command bar.

What is a command interface without auto complete?

autocomplete

Use Enter to select the item in the list.

How about a function to hide all the dock panels? Sure, why not.

from PyQt4.QtGui import QDockWidget
from qgis.utils import iface

@command.command()
def hide_docks():
    docks = iface.mainWindow().findChildren(QDockWidget)
    for dock in docks:
        dock.setVisible(False)

alias command

You can also alias a function by calling the alias function in the command bar.

The alias command format is alias {name} {function} {args}

Here is an example of predefining the x for point-at as mypoint

-> alias mypoint point-at 100

point-at is a built-in function that creates a point at x y; however, we can alias it so that it is pre-called with the x argument set. Now when we call mypoint we only have to pass the y each time.

-> mypoint
(point-at) What is the Y?: 200

You can even alias the alias command – because why the heck not :)

-> alias a alias
a mypoint 100

a is now shorthand for alias
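
One way an alias command like this could be implemented (purely illustrative; the commands registry below is an assumption, not the plugin's internals) is by partially applying the existing function:

import functools

# hypothetical registry mapping command names to their Python functions
commands = {}

def make_alias(alias_name, func_name, *preset_args):
    """Register alias_name as func_name with its first arguments pre-filled."""
    func = commands[func_name.replace('-', '_')]
    commands[alias_name.replace('-', '_')] = functools.partial(func, *preset_args)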

WHY U NO USE PYTHON CONSOLE

The Python console is fine and dandy, but we are not going for a full programming language here; that isn’t the point. The point is easy-to-use commands.

You could have a function called point_at in Python that would be called like this:

point_at(123,1331)

Handling incomplete function calls is a lot harder because of the Python parser. In the end it’s easier, and better in my opinion, to make a simple DSL for this and get all the power of a DSL, than to try to fit it into Python.

It should also be noted that the commands defined in the plugin can still be called like normal Python functions because there is no magic there. The command bar is just a DSL wrapper around them.
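
As a hedged sketch of what such a wrapper might do when arguments are missing (an illustration only, not the plugin's actual code; written for Python 2 to match the rest of this post):

import inspect

def call_interactive(func, given_args):
    """Call func, prompting for any argument that was not supplied."""
    expected = inspect.getargspec(func).args
    args = list(given_args)
    for name in expected[len(args):]:
        # raw_input is the Python 2 prompt function
        args.append(raw_input("What is the %s?: " % name.upper()))
    return func(*args)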

Notes

This is still a bit of an experiment for me, so things might change or might not work fully as expected just yet.

Check out the project’s readme for more info on what still needs to be done; I’m open to suggestions and pull requests.

Also see the docs page for more in-depth information.



Map corner coordinates in QGIS

EN | PT

Some time ago on the qgis-pt mailing list, someone asked how to show the corner coordinates of a map using QGIS. Since this feature wasn’t available (yet), I tried to reach an automatic solution, but without success. After some thought, and after reading a blog post by Nathan Woodrow, it came to me that the solution could be to create a user-defined function for the expression builder, to be used in labels in the map.

Closely following the blog post instructions, I created a file called userfunctions.py in the .qgis2/python folder and, with help from Nyall Dawson, I wrote the following code.

from qgis.utils import qgsfunction, iface
from qgis.core import QGis

@qgsfunction(2, "python")
def map_x_min(values, feature, parent):
    """
    Returns the minimum x coordinate of a map from
    a specific composer.
    """
    composer_title = values[0]
    map_id = values[1]
    composers = iface.activeComposers()
    for composer_view in composers:
        composer_window = composer_view.composerWindow()
        window_title = composer_window.windowTitle()
        if window_title == composer_title:
            composition = composer_view.composition()
            map = composition.getComposerMapById(map_id)
            if map:
                extent = map.currentMapExtent()
                break
    result = extent.xMinimum()
    return result

After running the command import userfunctions in the Python console (Plugins > Python Console), it was already possible to use the map_x_min() function (from the python category) in an expression to get the minimum X value of the map.

Screenshot from 2014-09-09

All I needed now was to create the other three functions: map_x_max(), map_y_min() and map_y_max(). Since part of the code would be repeated, I decided to put it in a function called map_bounds(), which takes the print composer title and map id as arguments and returns the map extent (in the form of a QgsRectangle).

from qgis.utils import qgsfunction, iface
from qgis.core import QGis

def map_bounds(composer_title, map_id):
    """
    Returns a rectangle with the bounds of a map
    from a specific composer
    """
    composers = iface.activeComposers()
    for composer_view in composers:
        composer_window = composer_view.composerWindow()
        window_title = composer_window.windowTitle()
        if window_title == composer_title:
            composition = composer_view.composition()
            map = composition.getComposerMapById(map_id)
            if map:
                extent = map.currentMapExtent()
                break
    else:
        extent = None

    return extent

With this function available, I could now use it in the other functions to obtain the map X and Y minimum and maximum values, making the code clearer and easier to maintain. I also added some checks to the original code to prevent errors.

@qgsfunction(2,"python")
def map_x_min(values, feature, parent):
 """
 Returns the minimum x coordinate of a map from a specific composer.
 Calculations are in the Spatial Reference System of the project.
<h2>Syntax</h2>
map_x_min(composer_title, map_id)
<h2>Arguments</h2>
composer_title - is string. The title of the composer where the map is.
 map_id - integer. The id of the map.
<h2>Example</h2>
map_x_min('my pretty map', 0) -> -12345.679

 """
 composer_title = values[0]
 map_id = values[1]
 map_extent = map_bounds(composer_title, map_id)
 if map_extent:
  result = map_extent.xMinimum()
 else:
  result = None

 return result

@qgsfunction(2,"python")
def map_x_max(values, feature, parent):
 """
 Returns the maximum x coordinate of a map from a specific composer.
 Calculations are in the Spatial Reference System of the project.
<h2>Syntax</h2>
map_x_max(composer_title, map_id)
<h2>Arguments</h2>
composer_title - is string. The title of the composer where the map is.
 map_id - integer. The id of the map.
<h2>Example</h2>
map_x_max('my pretty map', 0) -> 12345.679

 """
 composer_title = values[0]
 map_id = values[1]
 map_extent = map_bounds(composer_title, map_id)
 if map_extent:
  result = map_extent.xMaximum()
 else:
  result = None

 return result

@qgsfunction(2,"python")
def map_y_min(values, feature, parent):
 """
 Returns the minimum y coordinate of a map from a specific composer.
 Calculations are in the Spatial Reference System of the project.
<h2>Syntax</h2>
map_y_min(composer_title, map_id)
<h2>Arguments</h2>
composer_title - is string. The title of the composer where the map is.
 map_id - integer. The id of the map.
<h2>Example</h2>
map_y_min('my pretty map', 0) -> -12345.679

 """
 composer_title = values[0]
 map_id = values[1]
 map_extent = map_bounds(composer_title, map_id)
 if map_extent:
  result = map_extent.yMinimum()
 else:
  result = None

 return result

@qgsfunction(2,"python")
def map_y_max(values, feature, parent):
 """
 Returns the maximum y coordinate of a map from a specific composer.
 Calculations are in the Spatial Reference System of the project.
<h2>Syntax</h2>
map_y_max(composer_title, map_id)
<h2>Arguments</h2>
composer_title - is string. The title of the composer where the map is.
 map_id - integer. The id of the map.
<h2>Example</h2>
map_y_max('my pretty map', 0) -> 12345.679

 """
 composer_title = values[0]
 map_id = values[1]
 map_extent = map_bounds(composer_title, map_id)
 if map_extent:
  result = map_extent.yMaximum()
 else:
  result = None

 return result

The functions became available in the expression builder under the “Python” category (it could have been any other name), and the function descriptions are formatted as help texts to provide the user with all the information needed to use them.

Screenshot from 2014-09-09

Using the created functions, it was now easy to put the corner coordinates in labels near the map corners. Any change to the map extent is reflected in the labels, which makes this quite useful in atlas mode.

Screenshot from 2014-09-09

The functions’ results can be combined with other functions. In the following image, there is an expression that shows the coordinates in a more compact way.

Screenshot from 2014-09-09

There was a setback… For the functions to become available, it was necessary to manually import them in each QGIS session. Not very practical. Again with Nathan’s help, I found out that it’s possible to import Python modules at QGIS startup by putting a startup.py file with the import statements in the .qgis2/python folder. In my case, this was enough:

import userfunctions

I was pretty satisfied with the end result. The ability to create your own functions to use in expressions demonstrates once more how easy it is to customize QGIS and create your own tools. I’m already thinking of more applications for this amazing functionality.

UT 9 - Qta da Peninha - Vegetação potencial

You can download the Python files with the functions HERE. Just unzip both files to the .qgis2/python folder, restart QGIS, and the functions should become available.

Disclaimer: I’m not an English native speaker, therefore I apologize for any errors, and I will thank any advice on how to improve the text.

Oh God my plugins! My precious QGIS plugins

tl;dr

The API had to change. We don't like breaking plugins. It makes us sad. We did it now to save pain in the future. You will like the new version better. Trust me.

What happened to my cheese?

When updating to QGIS 2.0 you might have noticed two things. (Apart from all the new awesome stuff!)

  1. All your settings have been set back to defaults
  2. Some of your plugins are gone, or missing in the installer.

The resetting of settings is caused by QGIS 2.0 now storing its settings in a different folder than the one we used for 1.8. In 1.8, all your plugins, etc., were stored in the .qgis folder in your home directory; in 2.0 they are stored in .qgis2. The reason will become evident later. All user settings, the UI layout, database connections, etc., are now stored in a new QGIS 2 location; on Windows this is the registry key HKEY_CURRENT_USER\Software\QGIS\QGIS2. So this explains why your settings are missing when you first load QGIS 2.0.

Why did we have to move the settings location?

Good question.

2.0 is very different from 1.8. There has been a lot of work to make this the best version we have ever made: new features, more bug fixes, a bigger dev team, and an even bigger community. Being the next major release, we had to make some calls to remove some of the old code and features that were weighing us down, such as the old labelling engine, the old symbol engine, and the old vector API. Carrying old code and outdated features into the future can really hurt a project, and they had to be cut loose. Because of the massive amount of changes in 2.0, people needed to be able to run 2.0 and 1.8 on the same machine without causing any issues; if both stored settings in the same location, this would have had bad results.

Why move the settings. Part 2

Moving the settings was also a result of having non-backwards-compatible plugins between 1.x and 2.x. If we kept both plugin versions in the same folder it just wouldn't work: you would install a 1.8 version of a plugin, I would update my plugin to 2.0, you would then install the same plugin in 2.0, and now your 1.8 version is broken. Fun! To avoid this we moved all QGIS 2.0 stuff into .qgis2.

Why did my plugins break anyway. Why not just leave them be.

In 1.x we were using SIP v1. This meant the framework we used, PyQt, felt more like C++ than Python. If you are a Python developer, this isn't a very fun thing to deal with. In SIP v1 you need to tell PyQt, and our QGIS API, what to convert the type to: feature['mycolumn'].toInt() is pretty gross. In SIP v2 you can just do feature['mycolumn'] and SIP will convert the type for you automatically. This makes our API feel more like Python and less like C++. There are other changes when using SIP v2, but you get the idea. Unfortunately SIP v1 and v2 do not work together, so we couldn't make the code backwards compatible. This was also a forced change for us: once we switch to Python 3 at some stage in the future, v2 will be the default and we would have had to change then anyway. The bright side of this change is that most of the time you are removing code. Consider it a good time to go through your code, give it a bit of a polish, and remove anything that is no longer needed.
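
Side by side, and purely as an illustration (the attribute name is taken from the sentence above; the exact conversion call depends on the column type):

# SIP v1 / QGIS 1.x style: attribute access returns a QVariant that must be converted
value = feature['mycolumn'].toInt()

# SIP v2 / QGIS 2.x style: the attribute is already a plain Python value
value = feature['mycolumn']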

There was another major API change that needed to happen: the vector API update. In order to allow for multithreading in the future (and we all know everyone is looking forward to that), we needed to change how code asks for features from a layer. The old method would never work in a multithreaded structure and had to go.
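
For context, a minimal sketch of the 2.x pattern that replaced the old provider-level feature access (assuming layer is an already loaded vector layer):

from qgis.core import QgsFeatureRequest

# QGIS 2.x: the layer hands out a feature iterator
for feature in layer.getFeatures(QgsFeatureRequest()):
    print(feature['mycolumn'])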

What can I do if I need a plugin?

Having a plugin missing from the plugin installer when you really need it can be a bit of a pain. Plugin authors are working hard to update their plugins; I approve about two a day into the plugin repository. While most plugins will probably be updated at some stage, there are some things that you can do if you need a plugin to work with 2.0:

  1. Email the author of the plugin to see where they are at with the update

  2. Email the author and offer your help to update the plugin. Remember a lot of plugins are written by volunteers who just need the plugin to get their work done and wanted to share it with everyone.

  3. If the author has no intention of updating the plugin, or can't be contacted, you are free to update your local copy and offer it back to the community as the updated copy. If you are going to upload it back to the plugin repository, please try to contact the author and seek permission first. I will be watching for copies of plugins to make sure we don't end up with 10 versions of the same plugin. The GPL allows you to modify and share your updated version, but it's nice to keep the original author in the loop. If the author no longer wants to maintain the plugin and you are able to, then contact me and I will make you the owner of the plugin. Overall, be nice, not evil; we are all friends here.

  4. If you don't have, or know someone with, the skills to update the plugin, you can contact a developer to help update the plugin for you. Companies like mine, or Faunalia, or a whole range of other open source devs, can normally be contracted to update a plugin if needed.

Moving forward

We like the new API. It makes the Python side of QGIS much cleaner. There is still more work to do (it's never ending), but this is a good step. We don't like breaking plugins, but breaking a few now is better than breaking heaps as the popularity of QGIS continues to grow.


GRASS GIS 6.4.3RC4 released

Fourth (and last) release candidate of GRASS GIS 6.4.3 with improvements and stability fixes
A fourth release candidate of GRASS GIS 6.4.3 is now available.

Source code download:

Binaries download:

To get the GRASS GIS 6.4.3RC4 source code directly from SVN:
 svn checkout http://svn.osgeo.org/grass/grass/tags/release_20130710_grass_6_4_3RC4

Key improvements of this release include some new functionality (assistance for topologically unclean vector data), fixes in the vector network modules, fixes for the wxPython based portable graphical interface (attribute table management, wxNVIZ, and Cartographic Composer), fixes in the location wizard for Datum transform selection and support for PROJ.4 version 4.8.0, improvements for selecting the Python version to be used, enhanced portability for MS-Windows (native support, fixes in case of missing system DLLs), and more translations (esp. Romanian).

See also our detailed announcement:
 http://trac.osgeo.org/grass/wiki/Release/6.4.3RC4-News

First time users should explore the first steps tutorial after installation.

Release candidate management at
http://trac.osgeo.org/grass/wiki/Grass6Planning

Please join us in testing this release candidate for the final release.

Consider donating pizza or beer for the upcoming GRASS GIS Community Sprint in Prague.
Thanks to all contributors!

Installing Python setuptools into OSGeo4W Python

The easiest way to install Python libraries is to use easy_install and pip; both are package managers for Python. From the easy_install page:

Easy Install is a python module (easy_install) bundled with setuptools that lets you automatically download, build, install, and manage Python packages.

and from the pip page:

pip is a tool for installing and managing Python packages, such as those found in the Python Package Index. It’s a replacement for easy_install.

To get easy_install you need to install Python setuptools and you are good to go. Sounds easy!  However, the setuptools installer assumes that you have the normal standalone Python installed, which writes its install location to the registry, and when you run the installer it will say that it can’t find Python on the system.  What the!?

If you have installed QGIS, or any other tool from the OSGeo4W installer, you will see that OSGeo4W bundles its own version of Python in C:\OSGeo4W\apps\python27. This is the Python that is used when calling python in the OSGeo4W shell. Someone on the OSGeo wiki has made a bootstrapped installer for setuptools that will install setuptools and easy_install into the C:\OSGeo4W\apps\python27 folder for you.

Steps to take

  • Download ez_setup.py
  • Run python ez_setup.py in your OSGeo4W shell
  • Done!

To install a package with easy_install just use:

easy_install {packagename}

I wanted to have bottle and flask installed:

easy_install bottle

which gives you something like:

Searching for bottle
Reading http://pypi.python.org/simple/bottle/
Reading http://bottle.paws.de/
Reading http://github.com/defnull/bottle
Reading http://bottlepy.org/
Best match: bottle 0.11.4
Downloading http://pypi.python.org/packages/source/b/bottle/bottle-0.11.4.tar.gz#md5=f767c340de0b7c9581917c48e609479b
Processing bottle-0.11.4.tar.gz
Running bottle-0.11.4\setup.py -q bdist_egg --dist-dir c:\users\woo\appdata\local\temp\easy_install-5b4qq6\bottle-0.11.4\egg-dist-tmp-q2yd68
zip_safe flag not set; analyzing archive contents…
bottle: module references __file__
bottle: module references __path__
Adding bottle 0.11.4 to easy-install.pth file
Installing bottle.py script to C:\OSGeo4W\apps\Python27\Scripts
Installed c:\osgeo4w\apps\python27\lib\site-packages\bottle-0.11.4-py2.7.egg
Processing dependencies for bottle
Finished processing dependencies for bottle

Install pip
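
With setuptools and easy_install in place, pip itself can be installed the same way from the OSGeo4W shell (one straightforward route, shown here as an assumption since the original instructions for this step are missing):

easy_install pip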

Note

Most of the time, any Python packages needed by your OSGeo4W tools are bundled in the installer and can be installed using the OSGeo4W installer; however, there have been cases when I wanted to install a non-OSGeo4W package into my setup using easy_install or pip, like bottle and flask in the example above.




Remote debugging QGIS plugins using PyCharm

Wow, it's been a long time since I posted anything here... but I'm back with another (hopefully) useful howto. Over the last six months or so I have made a rather dramatic shift away from using VIM as my primary development environment for everything to using IDEs for my Python development work. I spent the first 4 or so of those months using Eclipse PyDev, which is excellent but has certain issues, particularly in a QGIS context. Most of the issues relate to my not being able to get it to reliably recognise the QGIS API after adding the QGIS Python directories to the package search path, and to the lack of decent refactoring tools. In my Java programming days of yore, I used to love the refactoring tools that Eclipse provided, but unfortunately these are not carried through to PyDev.

Lately I have been using PyCharm, a dedicated Python IDE also written in Java (ironic). I'm not going to attempt to review all of PyCharm here (Guido van Rossum already did that, but bear in mind the review is two years old and the software has surely advanced a lot since then). I will say there are two things that I find not as pleasant to use in PyCharm (compared to PyDev): its PyLint and PEP8 checkers. In the projects I work on we follow PEP8 very strictly, and not having a properly integrated tool for this is a real PITA. There are workarounds, however, so it is not a lost cause, but it would be nice if the developers took this a little more seriously:

On the other hand, I simply do not see the value of highlighting PEP 8 violations in the editor, especially in the middle of code modifications. We'd rather spend our time implementing inspections that report actual problems with the code, and not formatting nit-picks. - Dmitry Jemerov

One other big downside to PyCharm is that it is not open source software, which goes against the grain somewhat. They do however provide free licensing to those using it for bona fide open source development work, so don't let the price tag deter you if you are planning to use it for a FOSS project.

Read on to see how I use PyCharm for QGIS development...

PyCharm for QGIS remote debugging

Why use PyCharm? Well, it happens to be a really nice platform for QGIS Python / Python plugin development. For this article I am going to focus on the steps needed to remote debug Python scripts / plugins running inside QGIS. The process is conceptually the same as remote debugging using Eclipse/PyDev, which I have blogged about previously. I am going to start with the assumption that you have PyCharm set up and your plugin basics in place, and that you are now at the point where you wish to debug your software while it is running in QGIS. Let me prefix this by saying that I very seldom need to use this technique since our code is heavily tested (by means of a Python test suite), so in most cases I can just debug a particular test directly without needing to remotely attach to a Python process in QGIS. So the first thing you need to do is set up a Python debug server (provided as part of PyCharm). To do this, choose edit configurations from the task list:

Create a new run configuration

Next click on the little '+' icon and choose python remote debug:

image1

Set the following options:

  • Local host name: localhost
  • Port: 53100

Note that you can use any high port that you like (assuming it is unused).

You will notice that on the dialog it gives you some handy hints as to what needs to be inserted into your code in order to enable the trace point.

Creating the debug server run configuration

The next thing you need to do is add a couple of lines to the module that you wish to debug (this is also described in the above dialog). First, in your imports add this:

from pydev import pydevd

And then in the place where you wish execution to halt, add this line:

pydevd.settrace('192.168.1.62',
                port=53100,
                stdoutToServer=True,
                stderrToServer=True)

You can also try using 'localhost' instead of your IP address.

The last thing you need to have in place before you can test is that pydevd needs to be on your PYTHONPATH in the context of the running plugin. In my case I simply extracted the pydevd egg supplied in the root of the PyCharm installation into my plugin directory:

cp ~/apps/pycharm-2.5.2/pycharm-debug.egg .
unzip pycharm-debug.egg

There are a number of other ways you could do this, for example by changing your code to add the pydev directory into sys.path.
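
A hedged sketch of that sys.path alternative (the egg path below is just the one used earlier in this post; adjust it to your own PyCharm installation):

import os
import sys

# Make the pycharm-debug egg importable without unpacking it
# (eggs are zip-importable), so that pydevd can be imported and settrace() called.
egg = os.path.expanduser('~/apps/pycharm-2.5.2/pycharm-debug.egg')
if egg not in sys.path:
    sys.path.append(egg)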

Ok now you are all set. One thing to remember is that the settrace line is just the initial breakpoint - you can set additional breakpoints in your code using normal PyCharm debugging techniques. Now launch your PyCharm debug server configuration by clicking the little run icon next to it (highlighted in red below):

Start the debug server

After this you will see some output like this in the PyCharm run panel:

Starting debug server at port 53100
Waiting for connection...

Next fire up your copy of QGIS and open the plugin that will trigger your settrace. When the trace point is hit, PyCharm will enter debug mode and highlight the trace line in blue like this:

PyCharm debugging a remote process

Now you can step through your code, inspect variables and generally have a productive time understanding your code. I'll hopefully post a follow up article on how to set up PyCharm for QGIS development in the future! Happy hacking!
