
QGIS Planet

(Fr) Oslandia recrute : Ingénieur(e) développement d’applications SIG ( Python / SQL / QGIS ) – OSL2110A

Sorry, this entry is only available in French.

(Fr) QSoccer : QGIS, football, what else ?

Sorry, this entry is only available in French.

(Fr) Oslandia recrute : ingénieur(e) développement C++ / Python – OSL2011B

Sorry, this entry is only available in French.

Store and visualize your raster in the Cloud with COG and QGIS

We have recently been working for the French Space Agency (CNES), which needed to store and visualize satellite rasters on a cloud platform. They want to access the raw image data, with no transformation, in order to perform deep analysis such as instrument calibration. Using a classic cartographic server standard like WMS or TMS is not an option, because those services transform datasets into already-rendered tiles.

We chose to use a fairly recent format supported by GDAL, COG (Cloud Optimized GeoTIFF), and targeted the OVH cloud platform because it provides OpenStack, an open source cloud computing platform.

How it works

A COG file is a GeoTIFF file whose inner structure is tiled, meaning that the whole picture is divided into fixed-size tiles (256 x 256 pixels for instance), so parts of the raster can be retrieved efficiently. Combined with the standard HTTP/1.1 range request feature, this makes it possible to fetch specific tiles of an image over the network without downloading the entire raster.

We used a service provided by OpenStack, called Object Storage, to serve the COG imagery. Object Storage lets you store and retrieve files as objects using plain HTTP requests.
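To illustrate the mechanism, here is a minimal sketch of such a range request (the URL is a placeholder); only the requested byte range of the COG is transferred, not the whole file:

import requests

# Hypothetical object storage URL for the COG uploaded later in this article
url = "https://storage.example.net/v1/AUTH_xxx/MyContainer/cogfile.tif"

# Ask only for the first 16 KiB, which typically contains the COG header
response = requests.get(url, headers={"Range": "bytes=0-16383"})

print(response.status_code)   # 206 Partial Content
print(len(response.content))  # 16384 bytes, not the full raster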

Why not WCS?

The Web Coverage Service standard could have been an option: a WCS server can serve raw data according to a given geographic extent, and it is entirely possible to deploy a container or a VPS (Virtual Private Server) running a WCS server on a cloud platform. The main advantage of the COG solution over a WCS server is that you don’t have to deal with the burden of running a server: allocating resources, configuring load balancing, handling updates, and so on.

The beauty of the COG solution is its simplicity: it is only HTTP requests, and everything else (rendering, for instance) is done on the client side.

Step by step

Here are the different steps to go through if you want to browse a big raster image directly from the cloud.

First, let’s generate a COG file:

gdal_translate inputfile.tif cogfile.tif -co TILED=YES -co COPY_SRC_OVERVIEWS=YES -co COMPRESS=DEFLATE
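The same conversion can also be scripted with the GDAL Python bindings; here is a sketch of the equivalent call (note that COPY_SRC_OVERVIEWS assumes the input file already contains overviews, which gdaladdo can build beforehand):

from osgeo import gdal

# Equivalent of the gdal_translate command above
gdal.Translate(
    "cogfile.tif",
    "inputfile.tif",
    creationOptions=["TILED=YES", "COPY_SRC_OVERVIEWS=YES", "COMPRESS=DEFLATE"],
)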

Install the OpenStack client; this can be done easily with pip:

$ pip install python-openstackclient

Next, configure your OpenStack client in order to generate an authentication token. To do so, download your project-specific openrc file and source it to set up your environment:

$ source myproject-openrc.sh
Please enter your OpenStack Password for project myproject as user myuser:
**********
$ openstack token issue                                 
+------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field      | Value                                                                                                                                                                                   |
+------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| expires    | 2020-07-21T08:15:12+0000                                                                                                                                                                |
| id         | xxxx_my_token_xxxx
| project_id | 97e2e750f1904b41b76f80a50dabde0a                                                                                                                                                        |
| user_id    | 18f7ccaf1a2d4344a4e35f0d84eb065e                                                                                                                                                        |
+------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

You are now ready to push your COG file to the cloud instance:

openstack object create MyContainer cogfile.tif --name cogfile.tif

Before starting QGIS, two environment variables need to be set (replace xxxx_my_token_xxxx with the token you just generated):

$ export SWIFT_AUTH_TOKEN=xxxx_my_token_xxxx
$ export SWIFT_STORAGE_URL=https://storage.sbg.cloud.ovh.net/v1/AUTH_$OS_PROJECT_ID

This can also be done directly from the QGIS Python console by setting those variables through os.environ, as in the sketch below.
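A minimal sketch (the token and the project id in the storage URL are placeholders to replace with your own values):

import os

# Same effect as the export commands above, but set from within QGIS
os.environ["SWIFT_AUTH_TOKEN"] = "xxxx_my_token_xxxx"
os.environ["SWIFT_STORAGE_URL"] = "https://storage.sbg.cloud.ovh.net/v1/AUTH_<your_project_id>"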

Finally, add a cloud raster data source in QGIS.
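One way to do this step from the Python console is sketched below; it assumes GDAL’s /vsiswift/ virtual file system and the container and file names used above:

from qgis.core import QgsProject, QgsRasterLayer

# Open the COG through GDAL's /vsiswift/ virtual file system
layer = QgsRasterLayer("/vsiswift/MyContainer/cogfile.tif", "cogfile")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)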

You can now navigate your image, reading it directly from the cloud.

© CNES 2018, Distribution Airbus DS

Performance

While panning the map, QGIS downloads only the few tiles from the image needed to cover the view extent. The display latency you can see in the video depends essentially on:

  • The number of bands in your image
  • The pixel size
  • Your internet connection (mine, the one used for the video, is not an awesome one)

Note that the white flickering you can see when you move around the map and the raster is refreshed should be removed in the next version of QGIS, according to this QEP.

What’s next?

Thanks so much to the GDAL and QGIS contributors for adding such a nice feature! It opens up lots of possibilities for organizations that have to deal with a great number of big rasters and just want to explore parts of them.

We are already thinking about further improvements (easier authentication, better integration with Processing…), so if you are willing to fund them or just want to know more about QGIS, feel free to contact us at [email protected]. And please have a look at our support offering for QGIS.

Release of the COVADIS RAEPA extension for QWAT and QGEP

QWAT is an open source application for managing drinking water networks, originating from the municipalities of Pully, the SIGE in Vevey, Morges and Lausanne.
QGEP is its counterpart dedicated to wastewater and stormwater management, initiated by the QGIS Switzerland user group.

Data exchange between institutions is a cornerstone of water policies, and these exchanges rely on standardized formats. The Cantons of Fribourg (aquaFRI format) and Vaud (SIRE format) thus make certain public subsidies conditional on data being delivered in predefined formats, which gives these administrative levels a global view of the water and wastewater networks.

As part of an experiment with the QWAT (drinking water) and QGEP (wastewater) tools, Charente Eaux wanted to implement extensions dedicated to the French data exchange standard for water networks, the Géostandard Réseaux d’adduction d’eau potable et d’assainissement (RAEPA) defined by the Commission de validation des données pour l’information spatialisée (COVADIS).

Oslandia was commissioned to set up QWAT and QGEP instances, build the RAEPA extensions for each of these tools, and help Charente Eaux load the data of the local authorities belonging to this joint syndicate.

https://charente-eaux.fr/le-syndicat/qui-sommes-nous/

The QWAT work has been published as a standardized extension in the QWAT organization repository: https://github.com/qwat/extension_fr_raepa/

For QGEP, there is no extension management mechanism yet, so the repository https://gitlab.com/Oslandia/qgep_extension_raepa/ contains the data and view definitions to add manually to the data model.

The compatibility of the data models was assessed, and the choice was made to provide only views dedicated to data export. It is technically possible to create editable views in order to load data through them from files following the RAEPA template. However, the level of simplification and aggregation of the value lists makes this approach not very generic in the current state of the geostandard (v1.1), so at this stage it is more relevant for Charente Eaux to use loading scripts rather than going through this pivot format.

(Fr) Financement mutualisé du logiciel libre: le cas QGIS

Sorry, this entry is only available in French.

(Fr) Entretien avec Vincent Picavet

Sorry, this entry is only available in French.

Who is behind QGIS at Oslandia?

Are you using QGIS and looking for support services to improve your experience and solve problems? Oslandia is here to help you with our full range of QGIS services! Discover our team members below.

You will probably interact first with our pre-sales engineer, Bertrand Parpoil. He has been leading Geographical Information System projects for 15 years for large corporations, public administrations and hi-tech SMEs. Bertrand will listen to your needs and explore your use cases to offer you the best set of services.

Régis Haubourg also takes part in the first steps of projects to analyze your usages and improve them. A GIS expert, he knows QGIS by heart and will make the most of its capabilities. As QGIS Community Manager at Oslandia, he is very active in the QGIS community of developers and contributors. He is president of the francophone OSGeo local chapter (OSGeo-fr), a QGIS voting member, organizes the French QGIS day conference in Montpellier, and participates in QGIS community meetings. Before joining Oslandia, he led the migration to QGIS and PostGIS at the Adour-Garonne Water Agency, and he now guides our clients through their GIS migrations to open source solutions. Régis is also a great asset when working on water, hydrology and other specific thematic subjects.

Loïc Bartoletti develops QGIS, specializing in features corresponding to his fields of interest: network management, topography, urbanism, architecture… You will find him contributing to advanced vector editing in QGIS and writing Python plugins, notably for DICT management. Pushing CAD, and migrations from CAD tools to GIS and QGIS, is one of his major goals. He will develop your custom applications, combining technical expertise and functional competence. When bored, Loïc packages software on FreeBSD.

Vincent Mora is a senior developer in Python and C++, as well as a PostGIS expert. He has strong experience in numerical simulation and likes coupling GIS (PostGIS, QGIS) with 3D numerical computing for risk management or production optimization. Vincent is an official QGIS committer and can directly integrate your needs into the core of the software. He is also a GDAL committer and optimizes the low-level layers of your applications. Among numerous activities, Vincent is the lead developer of Hydra Software, a tool dedicated to unified hydraulics and hydrology modelling and simulation based on QGIS.

Hugo Mercier has also been an official QGIS committer for several years. He regularly speaks at international conferences on PostGIS, QGIS and other open source GIS software. He will implement your needs as new QGIS features, develop innovative plugins (like QGeoloGIS), and design and build your new custom applications, solving all kinds of technological challenges.

Paul Blottière completes our roster of QGIS committers: very active on core development, Paul refactored the QGIS Server component to bring it to industry-grade quality. He also designed and implemented the infrastructure that guarantees QGIS Server performance, and dedicated himself to QGIS Server OGC certification, especially for WMS 1.3.0. Thanks to this work, QGIS is now a reference OGC implementation.

Julien Cabièces recently joined Oslandia and quickly dived into QGIS: he contributes to the core of this desktop GIS, to the server component, and to applications linked to numerical simulation. Coming from a satellite imagery company with industrial applications, he uses his flexibility to answer all your needs, bringing quality and professionalism to your projects and minimizing risks for large production deployments.

You may also meet Vincent Picavet. Oslandia’s founder is a QGIS.org voting member, and is involved in the project’s evolution and the organization of the community.

Aside from these core contributors, all other Oslandia members also master QGIS and integrate this tool into their day-to-day projects.

Bertrand, Régis, Loïc, Vincent (x2), Hugo, Paul and Julien are listening and will be happy to work with you on your migrations, your application development, and any contribution you would like to make to the QGIS ecosystem. Do not hesitate to contact us!

(Fr) Oslandia recrute : Ingénieur(e) développement d’applications SIG ( Python / SQL / QGIS )

Sorry, this entry is only available in French.

QGIS Snapping improvements

A few months ago, we proposed to the QGIS grant program to make improvements to the snap cache in QGIS. The community vote selected our project which was funded by QGIS.org. Developments are now mostly finished.

In short, snapping is crucial for editing geospatial features. It is the only way to ensure they are topologically related, i.e. that connected vertices have exactly the same coordinates, even though manual digitizing on screen is imprecise by nature. Snapping correctly requires QGIS to keep in memory an indexed cache of the geometries to snap to. And maintaining this cache when data is modified, sometimes by another user or by database logic, can be a real challenge. This is exactly what this work addresses.

The proposal was divided into two different tasks:

  • Manage circular dependencies
  • Relax the snap cache index build

Manage circular data dependencies

Data dependencies

Data dependency is an existing feature that allows you to configure QGIS to reload layers (and their snapping cache) when a layer is modified.

It is useful when you store your data in a database and you set up triggers to maintain consistency between the different tables of your data model.

For instance, say you have topological information consisting of lines and nodes: nodes are part of lines and lines go through nodes. You move a node in QGIS and save your modifications to the database. In order to keep the data consistent, a trigger updates the geometry of the line going through the modified node.

Node 2 is modified, Line 1 is updated accordingly

QGIS, as a database client, has no way to know that the line layer currently displayed in the canvas needs to be refreshed after the trigger runs. The map canvas will be up to date, because QGIS fetches data for display without any caching system, but the snapping cache is not, and you end up with ghost snapping highlight issues.

Snapping highlights (light red) differ from real line (orange)

Defining a dependency between nodes and lines layers tells QGIS that it has to refresh the line layer when a node is modified.

Dependencies configuration: Lines layer will be refreshed whenever Nodes layer is modified

It also has to work the other way: modifying a line should update the nodes to ensure they still lie on the line.
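For reference, here is a minimal sketch of configuring the same dependencies from the Python console (the layer names are hypothetical):

from qgis.core import QgsProject, QgsMapLayerDependency

nodes = QgsProject.instance().mapLayersByName("Nodes")[0]
lines = QgsProject.instance().mapLayersByName("Lines")[0]

# Refresh the lines layer (and its snap cache) whenever the nodes layer
# is modified, and the other way around
lines.setDependencies([QgsMapLayerDependency(nodes.id())])
nodes.setDependencies([QgsMapLayerDependency(lines.id())])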

Circular data dependencies

So here we are, lines depend on nodes which depend on lines which depend on nodes which…

That’s what circular dependencies are about. This specific configuration was previously forbidden and needed special handling. Thanks to this recent development, it is now possible.

It’s also possible to add the layer itself as one of its own dependencies. This helps deal with cases where modifying one feature could lead to the modification of another feature in the same layer (to keep consistency in road networks, for instance).

Road 2 is modified, Road 1 is updated accordingly

This feature is available in the next QGIS LTR version 3.10.

Relax the snapping cache index build

If you work in QGIS with huge projects displaying a lot of vector data, and you enable snapping while editing these data, you have probably already met this dialog:

Snap indexing dialog

This dialog informs you that data are currently being indexed so you can snap to them while editing feature geometries. For big projects, this dialog can stay up for a really long time. Let’s work on speeding it up!

What’s a snap index?

Let’s say you want to move a line and snap it onto another one. While you drag your line with the mouse, QGIS looks for an existing geometry beneath the mouse cursor (with a certain pixel tolerance) every time you move the mouse. Without a spatial index, QGIS would have to go through every geometry in your layer to check whether it lies beneath the cursor position. This would be very inefficient.

In order to prevent this, QGIS keeps an index in which vector data are stored in a way that lets it quickly find out which geometry is beneath the mouse cursor. Building this data structure takes time, and that is what the progress dialog is about.
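To give an idea of what such an index provides, here is a minimal sketch using the QgsSpatialIndex class from the Python console (the layer name and the point coordinates are hypothetical):

from qgis.core import QgsProject, QgsSpatialIndex, QgsPointXY

layer = QgsProject.instance().mapLayersByName("lines")[0]

# Building the index is the costly step the progress dialog corresponds to
index = QgsSpatialIndex(layer.getFeatures())

# Afterwards, only the features close to a given point are looked up,
# instead of scanning the whole layer
nearest_ids = index.nearestNeighbor(QgsPointXY(761200.0, 6274000.0), 5)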

Firstly: Parallelize snap index build

If you want to be able to snap to all layers in your project, QGIS has to build one snap index per layer. This operation used to be done sequentially, meaning that if you have, for instance, 20 layers and index building lasts approximately 3 seconds for each, the whole index build lasts 1 minute. We modified QGIS so that index building is done in parallel. As a result, the total index building time could theoretically drop to 3 seconds!

4 layers snap index being built in parallel

However, parallel operations are limited by the number of CPU cores of your machine, meaning that with 4 cores (a Core i7, for instance) the total time will be at best 4 times shorter than the sequential build (15 seconds in our example).

Secondly: relax the snap build

For big projects, parallelizing index building is not enough and still takes too much time. Furthermore, to reduce snap index building time, an existing optimisation builds the spatial index only for a specific area of interest (determined from the displayed area and the layer size). As a consequence, once you are done waiting for an index to build, moving the map or zooming in or out could trigger another snap index build and another wait.

So, the idea was to avoid waiting at all. The snap index is still built whenever it needs to be (when you first enable snapping, when you move or zoom), but users no longer have to wait for the build to finish and can continue what they were doing (creating features, moving them…). Snapping highlights are simply missing while the index is being built and appear gradually as soon as it is finished. That’s what we call the relaxing mode.

No waiting dialog: snapping highlights appear as soon as the snap index is ready

This feature has been merged into the current QGIS master branch and will be present in the future QGIS 3.12 release. We keep working on it in order to make it more stable and efficient.

What’s next

We’ll continue to improve this feature in the coming days. If you have the chance to test it and encounter issues, please let us know on the QGIS tracker. If you are thinking of a missing feature or just want to know more about QGIS, feel free to contact us at [email protected]. And please have a look at our support offering for QGIS.

Many thanks to QGIS grant program for funding these new features. Thanks also to all the people involved in reviewing the code and helping to better understand the existing mechanism.

 

(Fr) Rechercher une adresse avec QGIS

Sorry, this entry is only available in French.

QGIS Versioning now supports foreign keys!

QGIS-versioning is a QGIS and PostGIS plugin dedicated to data versioning and history management. It supports:

  • Keeping full table history with all modifications
  • Transparent access to current data
  • Versioning tables with branches
  • Working offline
  • Working on a data subset
  • Conflict management with a GUI

QGIS versioning conflict management

In a previous blog article we detailed how QGIS versioning can manage data history, branches, and work offline with PostGIS-stored data and QGIS. We recently added foreign key support to QGIS versioning so you can now historize any complex database schema.

This QGIS plugin is available in the official QGIS plugin repository, and you can fork it on GitHub too!

Foreign key support

TL;DR

When a user decides to historize their PostgreSQL database with QGIS-versioning, the plugin alters the existing database schema and adds new fields in order to track the different versions of a single table row. Every access to these versioned tables is subsequently made through updatable views in order to automatically fill in the new versioning fields.

Up to now, it was not possible to deal with primary keys and foreign keys: the original tables had to be constraint-free. This limitation has been lifted thanks to this contribution.

To keep it simple, the solution is to remove all constraints from the original database and transform them into a set of SQL check triggers installed on the working copy databases (SQLite or PostgreSQL). As verifications are made on the client side, it is impossible to propagate invalid modifications to your central server when you “commit” updates.

Behind the curtains

When you choose to historize an existing database, a few fields are added to the existing tables. Among these fields, versioning_id identifies one specific version of a row. For one original row, there are several versions, each with a different versioning_id but with the same original primary key value. As a consequence, that field can no longer satisfy a unique constraint, so it cannot be a primary key, and therefore cannot be referenced by foreign keys either.

We therefore have to drop the primary key and foreign key constraints when historizing the table. Before removing them, the constraint definitions are stored in a dedicated table so that these constraints can be checked later.

When the user checks out a specific table on a specific branch, QGIS-versioning uses that constraint table to build constraint-checking triggers in the working copy. The way constraints are built depends on the checkout type (you can check out into an SQLite file, into the master PostgreSQL database or into another PostgreSQL database).

What do we check?

That’s where the fun begins! The first things we have to check on insert or update are key uniqueness and that foreign keys reference an existing key. Remember that there are no actual primary keys or foreign keys any more, since we dropped them when activating historization; we keep the terms for clarity.

You also have to deal with deleting or updating a referenced row and the different ways of propagating the modification: cascade, set default, set null, or simply failure, as explained in the PostgreSQL foreign keys documentation.

Never mind all that: this problem has been solved for you and everything is done automatically in QGIS-versioning. Before you ask, yes, foreign keys spanning multiple fields are also supported.

What’s new in QGIS?

You will get a message you probably already know when you try to commit an invalid modification to the master database:

Error when foreign key constraint is violated

Partial checkout

One existing QGIS-versioning feature is partial checkout. It allows a user to select a subset of data to check out into their working copy, which avoids downloading gigabytes of data you do not care about. You can, for instance, check out features within a given spatial extent.

So far, so good. But if you only have part of your data, you cannot ensure that modifying a field used as a primary key will preserve uniqueness. In this particular case, QGIS-versioning triggers errors on commit, pointing out the invalid rows you have to modify so the unique constraint remains valid.

Error when committing a non-unique key after a partial checkout

Tests

There is a lot to check when you intend to replace the existing constraint system with your own trigger-based one. In order to ensure QGIS-versioning stability and reliability, we put special effort into building a test set that covers all use cases and possible exceptions.

What’s next

There are now no known limitations on using QGIS-versioning on any of your databases. If you are thinking of a missing feature or just want to know more about QGIS and QGIS-versioning, feel free to contact us at [email protected]. And please have a look at our support offering for QGIS.

Many thanks to eHealth Africa who helped us develop these new features. eHealth Africa is a non-governmental organization based in Nigeria. Their mission is to build stronger health systems through the design and implementation of data-driven solutions.

(Fr) Oslandia recrute : développeur(se) C++ et Python

Sorry, this entry is only available in French.

QGIS 3 and performance analysis

Context

Since last year, we (the QGIS community) have been using QGIS-Server-PerfSuite to run performance tests on a daily basis. This way, we are able to monitor and avoid regressions against a set of test scenarios for several QGIS Server releases (currently the 2.18, 3.4, 3.6 and master branches). However, there are still many questions about performance from a general point of view:

  • What is the performance of QGIS Server compared to QGIS Desktop?
  • What are the implications of feature simplification for polygons and lines?
  • Does the symbology have a strong impact on performance and in which proportion?

Of course, it’s a broad and complex topic because of the numerous possibilities offered by the rendering engine of QGIS. In this article we’ll look at typical use cases with geometries coming from a PostgreSQL database.

Methodology

The first way to monitor performance is to measure the rendering time. To do so, the Map canvas refresh option is activated in the Settings of QGIS Desktop. This way we can get the rendering time from the Rendering tab of the log messages panel in QGIS Desktop, as well as from log messages written by QGIS Server.

The rendering time retrieved with this method gives the total amount of time spent rendering each layer (see the source code).

But in the case of QGIS Server, another interesting measure is the total time spent on a specific request, which may be read from the log messages too. There are indeed more operations involved in a single WMS request than in a simple rendering in QGIS Desktop:

The rendering time extracted from QGIS Desktop corresponds to the core rendering time displayed in the sequence diagram above. Moreover, to be perfectly comparable, the rendering engine must be configured in the same way in both cases. Thanks to the PyQGIS API, we can retrieve the necessary information from the Python console in QGIS Desktop, like the extent or the canvas size, in order to configure the GetMap WMS request with the appropriate WIDTH, HEIGHT and BBOX parameters.
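For instance, here is a minimal sketch run from the QGIS Desktop Python console (the server URL, layer name and output format are placeholders):

from qgis.utils import iface

canvas = iface.mapCanvas()
extent = canvas.extent()
size = canvas.size()

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "polygons",  # placeholder layer name
    "CRS": canvas.mapSettings().destinationCrs().authid(),
    "WIDTH": size.width(),
    "HEIGHT": size.height(),
    # note: WMS 1.3.0 expects an inverted axis order in BBOX for some CRSes
    "BBOX": "%f,%f,%f,%f" % (extent.xMinimum(), extent.yMinimum(),
                             extent.xMaximum(), extent.yMaximum()),
    "FORMAT": "image/png",
}
print("http://myserver/qgisserver?" + "&".join("%s=%s" % i for i in params.items()))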

Another way to examine the performance is to use a profiler in order to inspect stack traces. These traces may be represented as a FlameGraph. In this case, debug symbols are necessary, meaning that the rendering time is not representative anymore. Indeed, QGIS has to be compiled in Debug mode.

Polygons

For these tests we use the same dataset as that for the daily performance tests, which is a layer of polygons with 282,776 features.

Feature simplification deactivated

Let’s first have a look at the rendering time and the FlameGraph when simplification is deactivated. In QGIS Desktop, the mean rendering time is 2591 ms. Using the PyQGIS API, we are able to get the extent and size of the map and render it again, this time through a GetMap WMS request.

In this case, the rendering time is 2469 ms and the total request time is 2540 ms. For the record, the first GetMap request is ignored because the whole QGIS project is read and cached on that request, making its total time much higher. According to these results, the rendering times for QGIS Desktop and QGIS Server are very similar, which makes sense considering that the same rendering engine is used, but it is still very reassuring :).

Now, let’s take a look at the FlameGraph to detect where most of the time is spent.

 

The FlameGraphs are clearly similar in both cases, meaning that if we want to improve the performance of QGIS Server, we need to improve the performance of the core rendering engine, also used in QGIS Desktop. In our case the main method is QgsMapRendererParallelJob::renderLayerStatic, where most of the time is spent in:

| Method                            | Desktop % | Server % |
|-----------------------------------|-----------|----------|
| QgsExpressionContext::setFeature  | 6.39      | 6.82     |
| QgsFeatureIterator::nextFeature   | 28.77     | 28.41    |
| QgsFeatureRenderer::renderFeature | 29.01     | 27.05    |

Basically, it may be simplified like:

Clearly, the rendering takes about 30% of the total amount of time. In this case geometry simplification could potentially help.

Feature simplification activated

Geometry simplification, available for both polygon and line layers, may be activated and configured through the layer’s Properties, in the Rendering tab. Several parameters may be set:

  • Simplification may be deactivated
  • Threshold for a more drastic simplification
  • Algorithm
  • Provider simplification
  • Scale

Once simplification was activated, we varied the threshold as well as the algorithm in order to detect performance jumps:

The following conclusions can be drawn:

  • The Visvalingam algorithm should be avoided because it only becomes efficient at a high threshold, which means a significant loss of precision in geometries
  • The ideal threshold for the Snap To Grid and Distance algorithms seems to be 1.05: since it is a very low threshold, the precision of geometries stays pretty good while rendering time improves considerably (see the sketch below for applying this setting)
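As an illustration, here is a minimal sketch applying that recommended setting from the Python console (the layer name is hypothetical):

from qgis.core import QgsProject, QgsVectorSimplifyMethod

layer = QgsProject.instance().mapLayersByName("polygons")[0]

# Distance algorithm with a 1.05 threshold, as recommended above
method = QgsVectorSimplifyMethod()
method.setSimplifyHints(QgsVectorSimplifyMethod.GeometrySimplification)
method.setSimplifyAlgorithm(QgsVectorSimplifyMethod.Distance)
method.setThreshold(1.05)
layer.setSimplifyMethod(method)
layer.triggerRepaint()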

So far, these tests have been run on the full extent of the layer. However, we still have the Maximum scale parameter to test, so we decreased the scale of the layer:

And in this case, results are pretty interesting too:

Several conclusions can be drawn:

  • Visvalingam algorithm should be avoided at low scale too
  • Snap To Grid seems counter-productive at low scale
  • Distance algorithm seems to be a good option

Lines

For these tests we also use the same dataset as that for daily performance tests, which is a layer of lines with 125,782 features.

Feature simplification activated

As for polygons, we tested the effect of geometry simplification on the rendering time, varying algorithms and thresholds:

In this case we have exactly the same conclusion as for polygons: the Distance algorithm should be preferred with a threshold of 1.05.

For QGIS Server the mean rendering time is about 1180 ms with geometry simplification compared to 1108 ms for QGIS Desktop, which is totally consistent. And looking at the FlameGraph we note that once again most of the time is spent in accessing the PostgreSQL database (about 30%) and rendering features (about 40%).

Symbology

Another parameter which has an obvious impact on performance is the symbology used to draw the layers. Some features are known to be time consuming, but we felt that a thorough study was necessary to verify it.

 

Firstly, we’ve studied the influence of the width as well as the Single Symbol type on the rendering time.

Some points are noteworthy:

  • Simple Line is clearly the least time consuming
  • Beyond the default 0.26 line width, rendering time starts to rise significantly, with a clear jump in performance

 

Another interesting feature is the Draw effects option, which allows adding some fancy effects (shadow, glow, …).

However, this feature is known to be particularly CPU intensive. Rendering all 125,782 lines took so long that we had to switch to a lower scale, with just a few dozen lines. The results are unequivocal:

 

The last thing we wanted to test for symbology is the effect of the Categorized classification. Here are the results for some classifications with geometry simplification activated:

  • No classification: 1108 ms
  • A simple classification using the column “classification” (8 symbols): 1148 ms
  • A classification based on a stupid expression “classification x 3” (8 symbols): 1261 ms
  • A classification based on string comparison “toponyme like ‘Ruisseau*’” (2 symbols): 1380 ms
  • A classification with a specific width line for each category (8 symbols): 1850 ms

Considering that a simple classification does not add excessive extra cost, it seems that the classification process itself is not very time consuming. However, as soon as an expression is used, we observe a slight increase in rendering time.
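For reference, here is a minimal sketch of building a categorized classification like the one benchmarked above, from the Python console (the layer name is hypothetical; the "classification" column is the one used in this test):

from qgis.core import (QgsCategorizedSymbolRenderer, QgsProject,
                       QgsRendererCategory, QgsSymbol)

layer = QgsProject.instance().mapLayersByName("lines")[0]

# One category (with a default symbol) per distinct value of "classification"
categories = []
for value in layer.uniqueValues(layer.fields().indexOf("classification")):
    symbol = QgsSymbol.defaultSymbol(layer.geometryType())
    categories.append(QgsRendererCategory(value, symbol, str(value)))

layer.setRenderer(QgsCategorizedSymbolRenderer("classification", categories))
layer.triggerRepaint()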

Labeling

Another important part to study regarding performance is labeling and the underlying positioning. For this test we decreased the scale and varied the Placement parameter without tuning anything.

Clearly, parallel labeling is much more time consuming than the other placements. However, as previously stated, we used the default parameters for each placement, meaning that the number of labels actually drawn on the map differs from one placement to another.

Points

The last kind of geometry to study is points. As for polygons and lines, we used the same dataset as the performance tests, that is a layer with 435,588 points.

For point geometries, geometry simplification is of course not available, so we focus on symbology and the impact of marker size.

Obviously, Font Marker, as well as SVG Symbol, must be used carefully because of the associated jump in rendering time. Moreover, contrary to Simple Marker, increasing the size implies a drastic increase in rendering time.

General conclusion

Based on this factual study, several conclusions can be drawn.

Globally, the FlameGraphs for QGIS Desktop and QGIS Server are completely similar, as are the rendering times.

It means that if we want to improve the performance of QGIS Server, we have to work on the desktop configuration and the rendering engine of the QGIS core library.

Extracting generic conclusions from our tests is very difficult, because it clearly depends on the underlying data. But let’s try to suggest some recommendations :).

Firstly, geometry simplification seems pretty efficient with lines and polygons as soon as the algorithm is chosen cautiously, and as long as your features include many vertices. It seems that the Distance algorithm with a 1.05 threshold is a good choice, with both high and low scale. However, it’s not a magic solution!

Secondly, special care is needed with regard to symbology. Indeed, in some cases a clear jump in rendering time is noticeable. For example, fancy effects, Font Marker and SVG Symbol have to be used with caution if you are picky about rendering time.

Thirdly, we have to be aware of the extra cost caused by labeling, especially the Parallel placement for line geometries. On this subject, a not very well-known parameter allows labeling time to be reduced drastically: the PAL candidates option. We can decrease labeling time by reducing the number of candidates. For a concrete use case, you can take a look at the daily reports.

In any case, improving server performance in a substantial way means improving the QGIS core library directly.

In particular, we noticed thanks to the FlameGraphs that most of the time is spent drawing features and managing data from the PostgreSQL database. By the way, a legitimate question is: “How much time do we spend waiting for the database?”. To be continued 😉

If you hit performance issues on your specific configuration or want to improve QGIS awesomeness, we provide a unique QGIS support offer at http://qgis.oslandia.com/ thanks to our team of specialists!

Funding for selective masking in QGIS is now complete!

A few months ago, Oslandia launched QGIS Lab’s, a place to advertise our new ideas for QGIS, but also a place to help you find co-funders to make dreams become reality.

The first initiative is about label selective masking, a feature that will allow us to achieve even more professional rendering for our maps.

Selective masking

 

Thanks to the commitment of the Swiss QGIS user group and local authorities, this work is now funded!

We can now start working hard to deliver this great feature for QGIS 3.10.

Thanks again to our funders

One last word: this is not a classical crowdfunding initiative, but a classical contract for each funder.

No more reason not to contribute to free and open source software!

Nouvel outil d’édition des géométries dans QGIS : tronquer/prolonger

Sorry, this entry is only available in French.

Visualization of borehole logs with QGIS

At Oslandia, we have been working on a component based on the QGIS API for the visualization of well and borehole logs.

This component aims at displaying data collected vertically along wells dug underground. It mainly focuses on data organized as series of contiguous samples, and is generic enough to be used for both vertical data (where the Y axis corresponds to a depth underground) and horizontal data (where the X axis corresponds to time, for example). One of the main concerns was to ensure good display performance with a large volume of sample points (usually hundreds of thousands of sample points).

We had already worked on plot components in the past, but for specific QGIS plugins (for time series and stratigraphic logs), and thought we would put these past experiences to good use by creating a more generic library.

We decided to represent sample points as geometry features in order to be able to reuse the rich symbology engine of QGIS. This allows users to represent their data however they like, without us having to rewrite an entire symbology engine. We also benefit from all the performance optimizations that have been added and polished over the years (on-the-fly geometry simplification, for example).

Although written in Python, the component achieves good display performance. The key trick was to avoid copying data between the sample points read by QGIS and the Python graphing component. This is achieved thanks to the fact that the PyQGIS API has some functions that respect the buffer protocol.

You can have a look at the following video to see this component integrated in an existing plugin.

Visualization of borehole logs within QGIS

It is distributed as usual under an open source license and the code repository can be found on GitHub.

Do not hesitate to contact us ( [email protected] ) if you are interested in any enhancements around this component.

New QGIS geometry editing tool: trim/extend

Oslandia is continuing to improve QGIS to offer the draftsman the same experience as with the computer-aided design (CAD) software on the market.
Today we introduce you to the “trim/extend” tool.

As its name suggests, this tool, well known to CAD draftsmen, allows you to trim / shorten or extend segments.

However, where CAD tools impose some limitations – such as working only on the end of polyline segments and only on 2D objects – we wanted to go further in QGIS:
– you can modify any segment of a line or polygon geometry, not just its end;
– you can of course modify multi-geometries;
– you can snap to 3D points.

Click to view our demo video.

The tool will be available in version 3.6, but you can use it now in the development version, via the advanced Windows installer for example.

This tool was developed thanks to the funding of the Megève Town Hall, which we would like to thank in particular.

It is still possible to improve the tool by offering:

– multiple selection of limits and entities to be modified;
– an option to modify the two selected entities up to their intersection point;
– better support for curved geometries in the future;
– topology support when using the tool.

Would you like to participate in its improvement or the development of QGIS?

Need a study or support to move from CAD to GIS?

Do not hesitate to contact us.

Also follow us on Twitter or LinkedIn to be informed about our new CAD tools in QGIS.

QGIS Server 3 : OGC Certification work for WFS 1.1.0

QGIS Server is an open source OGC data server which uses the QGIS engine as its backend. What makes it really awesome is that a simple QGIS desktop project file can be served as web services with exactly the same rendering, without any mapfile or XML coding by hand.

QGIS Server provides a way to serve OGC web services like WMS, WCS, WFS and WMTS resources from a QGIS project, but also offers extensions such as GetPrint, which takes advantage of the QGIS map composer to generate high-quality PDF outputs.

Since version 3.0.2, QGIS Server has been certified as an official OGC reference implementation for WMS 1.3.0, and reports are generated daily by continuous integration to avoid regressions.

 

Thus, the next step was naturally to take a look at WFS 1.1.0, thanks to the support of the QGIS Grant Program.

As a side note, this grant program is made possible thanks to your direct sponsorship of, and micro-donations to, QGIS.org.

TEAM Engine test suite for WFS 1.1.0

We use a tool provided by the OGC Compliance Program to run dedicated tests on the server: TEAM Engine (Test, Evaluation, And Measurement Engine).

Test suites are available through a web interface. However, for the needs of continuous integration, these tests have to be run without user interaction. In the case of WMS 1.3.0, nothing is easier than using the REST API:

$ curl "http://localhost:8081/teamengine/rest/suites/wms/1.20/run?queryable=queryable&basic=basic&capabilities-url=http://172.17.0.2/qgisserver?REQUEST=GetCapabilities%26SERVICE=WMS%26VERSION=1.3.0%26MAP=/data/teamengine_wms_130.qgs"

 

However, the WFS 1.1.0 test suite does not provide a REST API, which makes the situation less straightforward. We switched to using TEAM Engine directly from the command line:

$ cd te_base
$ ./bin/unix/test.sh -source=wfs/1.1.0/ctl/main.ctl -form=params.xml

The params.xml file allows the underlying tests to be configured; in this particular case, it contains the GetCapabilities URL of the QGIS Server instance to test. Results are available through the viewlog.sh shell script:

$ ./bin/unix/viewlog.sh -logdir=te_base/users/root/ -session=s0001
Test wfs:wfs-main type Mandatory (s0001) Failed (InheritedFailure)
   Test wfs:readiness-tests type Mandatory (s0001/d68e38807_1) Failed (InheritedFailure)
      Test ctl:SchematronValidatingParser type Mandatory (s0001/d68e38807_1/d68e588_1) Failed
      Test wfs:basic-main type Mandatory (s0001/d68e38807_1/d68e636_1) Failed (InheritedFailure)
         Test wfs:run-GetCapabilities-basic-cc-GET type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1) Failed (InheritedFailure)
            Test wfs:wfs-1.1.0-Basic-GetCapabilities-tc1 type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1095_1) Passed
               Test ctl:assert-xpath type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1095_1/d68e1234_1) Passed
            Test wfs:wfs-1.1.0-Basic-GetCapabilities-tc2 type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1100_1) Passed
               Test ctl:assert-xpath type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1100_1/d68e1305_1) Passed
            Test wfs:wfs-1.1.0-Basic-GetCapabilities-tc3 type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1105_1) Passed
               Test ctl:assert-xpath type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1105_1/d68e1558_1) Passed
               Test ctl:assert-xpath type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1105_1/d68e1582_1) Passed
               Test ctl:assert-xpath type Mandatory (s0001/d68e38807_1/d68e636_1/d68e28810_1/d68e1105_1/d68e1606_1) Passed
...
...

Finally, a Python script was written to read these logs and generate an easily readable HTML report. Thanks to our QGIS-Server-CertifSuite, which provides the continuous integration infrastructure with Docker images, these reports are also generated daily.

Bugfix and Conclusion

The first results were clear: a lot of work was necessary to get QGIS Server certified for WFS 1.1.0!

We started fixing the issues one by one:

And now we have much better support than 6 months ago.

However, some work still needs to be done to finally obtain OGC certification for WFS 1.1.0. To be continued!

Please contact us if you want QGIS Server to become a reference implementation for all OGC services!

French Ministry in charge of the ecological transition selected Oslandia

Ministère de la Transition Écologique et Solidaire Logo

The French Ministry of the Ecological Transition selected Oslandia for two of the three packages of its call for tenders dedicated to geomatics tools. We are very proud to dedicate our team to one of the strongest supporters of geomatics and open source in France for the next 2 to 4 years.

The first package is dedicated to expert studies covering spatial databases, software, components, protocols, norms and standards in the geomatics field.

The second package provides support and development for QGIS, the PostGIS spatial extension for PostgreSQL, and their components. We are really happy to continue joint work already begun under the previous contract.

This is yet another proof of a major trend towards open source investment, where geomatics components are currently among the most dynamic and strongest open source projects. It also confirms that stakeholders are now deeply integrating the economic rationale of open source contribution into their policies.
