
QGIS Planet

What went on at FOSS4G 2015?

Granted, I could only follow FOSS4G 2015 remotely on social media, but what I saw was quite impressive and will keep me busy exploring for quite a while. Here’s my personal pick of this year’s highlights which I’d like to share with you:

QGIS

Marco Hugentobler at FOSS4G 2015 (Photo by Jody Garnett)

The Sourcepole team has been particularly busy with four presentations which you can find on their blog.

Marco Hugentobler’s keynote is just great, summing up the history of the QGIS project and discussing success factors for open source projects.

Marco also gave a second presentation on new QGIS features for power users, including live layer effects, new geometry support (curves!), and geometry checker.

There has also been an update to the QTiles plugin by NextGIS this week.

If you’re a bit more into webmapping, Victor Olaya presented the Web App Builder he’s been developing at Boundless. Web App Builder should appear in the official plugin repo soon.

Preview of the Web App Builder from Victor’s presentation

Geocoding

If you work with messy, real-world data, you’ve most certainly been fighting with geocoding services, trying to make the best of a bunch of address lists. The Python Geocoder library promises to make dealing with geocoding services such as Google, Bing, OSM and many more easier than ever before.
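
A first test run only takes a couple of lines. Here’s a minimal sketch, assuming the geocoder package is installed (pip install geocoder); the address and the choice of the OSM provider are just examples:

import geocoder

# Query the OSM (Nominatim) provider; google, bing & co. follow the same pattern
g = geocoder.osm("Stephansplatz 1, 1010 Wien")
if g.ok:
    print(g.latlng)  # [lat, lng]
    print(g.json)    # full result as a dictionary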

Let me know if you tried it.

Mobmap Visualizations

Mobmap – or more specifically Mobmap2 – is an extension for Chrome which offers visualization and analysis capabilities for trajectory data. I haven’t tried it yet but their presentation certainly looks very interesting:


QGIS on the rise with journalists

If you are following QGIS on Twitter, you’ve probably noticed the increasing number of tweets by journalists using QGIS.

For example this map in the Financial Times by Hannah Dormido

or this one with overview maps and three different levels of detail

or this map with semi-transparent label backgrounds and nice flag images

or even Time Manager animations by raoulranoa in the Los Angeles Times

I think this is a great development and a sign of how widespread QGIS usage is today.

If you know of any other examples or if you are a journalist using QGIS yourself, I’d love to see more!


Video tutorial: animated heatmaps with QGIS

Do you like the QGIS heatmap functionality? Did you know that QGIS can also create animated heatmaps?

The following video tutorial shows all necessary steps. To reproduce it, you can get the sample data from my Time Manager workshop at #QGIS2015.


Time Manager workshop at #QGIS2015

Today was the final day of #QGIS2015, the first joint QGIS conference and developer meeting. I had the pleasure to meet Time Manager co-developer Karolina Alexiou aka carolinux in person and to give a talk, including a hands-on Time Manager workshop, together with her. Time Manager makes it possible to explore spatio-temporal data by creating animations directly in QGIS.

The talk presents QGIS visualization tools with a focus on efficient use of layer styling to both explore and present spatial data. Examples include the recently added heatmap style as well as sophisticated rule-based and data-defined styles. The main focus is on exploring and presenting spatio-temporal data using the Time Manager plugin. A special treat is time-dependent styling: expression-based style properties that access the current Time Manager timestamp.

To get the example data and QGIS projects, download Time_Manager_Examples.zip.


Trajectory animations with fadeout effect

Today’s post is a short tutorial for creating trajectory animations with a fadeout effect using QGIS Time Manager. This is the result we are aiming for:

The animation shows the current movement in pink which fades out and leaves behind green traces of the trajectories.

About the data

GeoLife GPS Trajectories were collected within the Microsoft Research Asia GeoLife project by 182 users over a period of more than three years (from April 2007 to August 2012). [1,2,3] The GeoLife GPS Trajectories download contains many text files organized in multiple directories. The data files are basically CSVs with 6 lines of header information. They contain the following fields:

Field 1: Latitude in decimal degrees.
Field 2: Longitude in decimal degrees.
Field 3: All set to 0 for this dataset.
Field 4: Altitude in feet (-777 if not valid).
Field 5: Date – number of days (with fractional part) that have passed since 12/30/1899.
Field 6: Date as a string.
Field 7: Time as a string.
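
Reading one of these files is straightforward in plain Python. The following is a minimal sketch which uses the string date and time fields (fields 6 and 7) rather than the serial day number; the file path is of course hypothetical:

import csv
from datetime import datetime

def read_geolife(path):
    """Yield (lat, lon, altitude, timestamp) tuples from one GeoLife text file."""
    with open(path) as f:
        rows = csv.reader(f)
        for _ in range(6):   # skip the 6 header lines
            next(rows)
        for row in rows:
            if len(row) != 7:
                continue
            lat, lon, _, alt, _, day, time = row
            yield (float(lat),
                   float(lon),
                   float(alt),  # -777 means "not valid"
                   datetime.strptime(day + " " + time, "%Y-%m-%d %H:%M:%S"))

for point in read_geolife("Data/000/Trajectory/20081023025304.plt"):
    print(point)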

Data prep: PostGIS

Since any kind of GIS operation on text files will be quite inefficient, I decided to load the data into a PostGIS database. This table of millions of GPS points can then be sliced into appropriate chunks for exploration, for example, a day in Beijing:

CREATE MATERIALIZED VIEW geolife.beijing AS
SELECT trajectories.id,
       trajectories.t_datetime,
       trajectories.t_datetime + interval '1 day' AS t_to_datetime,
       trajectories.geom,
       trajectories.oid
  FROM geolife.trajectories
 WHERE st_dwithin(trajectories.geom,
                  st_setsrid(st_makepoint(116.3974589, 39.9388838), 4326),
                  0.1)
   AND trajectories.t_datetime >= '2008-11-11 00:00:00'
   AND trajectories.t_datetime < '2008-11-12 00:00:00'
WITH DATA;

Trajectory viz: a fadeout effect for point markers

The idea behind this visualization is to show both the current movement as well as the history of the trajectories. This can be achieved with a fadeout effect which leaves behind traces of past movement while the most recent positions are highlighted to stand out.

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.

This effect can be created using a Single Symbol renderer with a marker symbol with two symbol layers: one layer serves as the highlights layer (pink) while the second layer represents the traces (green) which linger after the highlights disappear. Feature blending is used to achieve the desired effect for overlapping markers.


The highlights layer has two expression-based properties: color and size. The color fades to white and the point size shrinks as the point ages. The age can be computed by comparing the point’s t_datetime timestamp to the Time Manager animation time $animation_datetime.

This expression creates the color fading effect:

color_hsv(  
  311,
  scale_exp( 
    minute(age($animation_datetime,"t_datetime")),
    0,60,
    100,0,
    0.2
  ),
  90
)

and this expression makes the point size shrink:

scale_exp( 
  minute(age($animation_datetime,"t_datetime")),
  0,60,
  24,0,
  0.2
)

Outlook

I’m currently preparing this and a couple of other examples for my Time Manager workshop at the upcoming 1st QGIS conference in Nødebo. The workshop materials will be made available online afterwards.

Literature

[1] Yu Zheng, Lizhu Zhang, Xing Xie, Wei-Ying Ma. Mining interesting locations and travel sequences from GPS trajectories. In Proceedings of the International Conference on World Wide Web (WWW 2009), Madrid, Spain. ACM Press: 791-800.
[2] Yu Zheng, Quannan Li, Yukun Chen, Xing Xie, Wei-Ying Ma. Understanding Mobility Based on GPS Data. In Proceedings of ACM conference on Ubiquitous Computing (UbiComp 2008), Seoul, Korea. ACM Press: 312-321.
[3] Yu Zheng, Xing Xie, Wei-Ying Ma. GeoLife: A Collaborative Social Networking Service among User, Location and Trajectory. Invited paper, IEEE Data Engineering Bulletin, 33(2), 2010, pp. 32-40.


Experiments with Conway’s Game of Life

This experiment is motivated by a discussion I had with Dr. Claus Rinner about introducing students to GIS concepts using Conway’s Game of Life. Conway’s Game of Life is a popular example to demonstrate cellular automata. Based on an input grid of “alive” and “dead” cells, new cell values are computed on each iteration based on four simple rules for the cell and its 8 neighbors:

  1. Any live cell with fewer than two live neighbours dies, as if caused by under-population.
  2. Any live cell with two or three live neighbours lives on to the next generation.
  3. Any live cell with more than three live neighbours dies, as if by overcrowding.
  4. Any dead cell with exactly three live neighbours becomes a live cell, as if by reproduction.

(Source: Wikipedia – Conway’s Game of Life)

Based on these simple rules, effects like the following “glider gun” can be achieved:

"Gosper's glider gun" by Kieff (own work). Licensed under CC BY-SA 3.0 via Wikimedia Commons.

There are some Game of Life implementations for GIS out there, e.g. scripts for ArcGIS or a module for SAGA. Both of these examples are raster-based. Since I couldn’t find any examples of raster manipulation like this in pyQGIS, I decided to instead implement a vector version: a Processing script which receives an input grid of cells and outputs the next iteration based on the rules of Game of Life. In the following screencast, you can see the Processing script being called repeatedly by a script from the Python console:

So far, it’s a quick and dirty first implementation. To make it smoother, I’m considering adding spatial indexing and using memory layers instead of having Processing create a bunch of Shapefiles.

It would also be interesting to see a raster version done in PyQGIS. Please leave a comment if you have any ideas how this could be achieved.
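
As a starting point for such a raster version, the core iteration is easy to express with numpy, independently of QGIS. The following sketch wraps around at the grid edges (a torus), which is my own assumption and not part of the Processing script described above:

import numpy as np

def game_of_life_step(grid):
    """Return the next generation for a 2D array of 0/1 cells."""
    # Count the 8 neighbours of each cell by summing shifted copies of the grid
    # (np.roll wraps around, so the grid behaves like a torus)
    neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    survive = (grid == 1) & ((neighbours == 2) | (neighbours == 3))  # rules 1-3
    born = (grid == 0) & (neighbours == 3)                           # rule 4
    return (survive | born).astype(grid.dtype)

# One step of a "blinker" oscillator
grid = np.zeros((5, 5), dtype=int)
grid[2, 1:4] = 1
print(game_of_life_step(grid))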


Visualizing direction-dependent values

When mapping flows or other values which relate to a certain direction, styling these layers gets interesting. I faced this challenge when mapping direction-dependent error values: neighboring cell pairs were connected by two lines, one in each direction, each with an associated error value. This is what I came up with:

srtm_errors_1200px

Each line is drawn with an offset to the right. The size of the offset depends on the width of the line which in turn depends on the size of the error. You can see the data-defined style properties here:

directed_error_style

To indicate the direction, I added a marker line with one > marker at the center. This marker line also got assigned the same offset to match the colored line below. I’m quite happy with how these turned out and would love to hear about your approaches to this issue.

srtm_errors_detail

These figures are part of a recent publication with my AIT colleagues: A. Graser, J. Asamer, M. Dragaschnig: “How to Reduce Range Anxiety? The Impact of Digital Elevation Model Quality on Energy Estimates for Electric Vehicles” (2014).


3D viz with QGIS & three.js

If you are looking for a tool to easily create 3D visualizations of your geodata, look no further! Qgis2threejs is a plugin by Minoru Akagi which exports terrain data, combined with the map canvas image and optional vector data, to an HTML file that can be viewed in 3D in any web browser which supports WebGL. To do that, the plugin uses the Three.js library.

This is the result of my first experiments with Qgis2threejs. In the following sections, I will show the steps to reproduce it.

Türkenschanzpark, Vienna

click for the interactive version (requires WebGL-capable browser)

1. The data

The building blocks of this visualization are:

  • elevation data and the hillshade derived from this data
  • a base map (WMTS from basemap.at in my case)
  • OSM building data provided by Geofabrik and
  • tree data from the city of Vienna

Load all datasets into QGIS.

2. Preparing the map

Qgis2threejs will overlay the map (as rendered in the QGIS map area) on top of the elevation model. You can combine any number of layers to create your map. I just loaded a basemap.at WMTS and a hillshade layer. To add a nice tree shadow effect, I also added the tree layer (dark grey, 50% transparency, multiply blending).

tuerkenschanzpark_map

3. Preparing the vector features

The vector features in the visualization are buildings and trees. The buildings are based on an OSM building layer. The trees are created from two point layers: one point layer to create the tree trunks (cylinder shape) and a duplicate of this point layer to create the tree crowns (sphere shape).

Load the data and choose the desired fill colors.

4. Using Qgis2threejs

Now we can start Qgis2threejs. The first tab is used to configure the terrain. Just pick the correct elevation data layer. I didn’t modify any of the other default settings.

qgis2threejs_dem

The second tab provides the settings for the vector data. As mentioned in the previous section, the trees are created from two point layers and the buildings are based on a polygon layer. The tree crowns are spheres with a radius of 3 and a z value of 5 above the surface. The tree trunks are cylinders. Finally, the buildings have a height of 10.

qgis2threejs_vector

That’s it! Just press “run” and wait. When the export is finished, your default browser (or a different one, if you specify another one in the plugin settings) will open automatically and display the results.
The visualization is interactive. You can tilt the visualization using the left mouse button, pan using the right mouse button, and zoom using the mouse wheel. I found that Firefox used around 1.6 GB of RAM to render this example.

5. Share your visualization

In the browser window, you will see where Qgis2threejs stored the HTML and associated JavaScript files. To share your visualization, you just need to copy these files onto a web server.

I would love to see what you come up with. Please share a link in the comments.


Data-defined properties in QGIS 2.0

In QGIS 2.0, the old “size scale” field has been replaced by data-defined properties which enable us to control many more properties than just size and rotation. One of the often requested features – for example – is the possibility of data-defined colors:

datadefinedproperties

Today’s example map visualizes a dataset of known meteorite landings published on http://visualizing.org/datasets/meteorite-landings. I didn’t clean the data, so there is quite a bunch of meteorites at 0/0.

To create the map, I used the QGIS 2.0 feature blending mode “multiply” as well as a data-defined size based on meteorite mass:

meteorites1

Background oceans and graticule by NaturalEarthData.


Dataviz with OpenSource Tools

Today, I’ve finished my submission for the Hubway Data Visualization Challenge. All parts of the resulting dataviz were created using open source tools. My toolbox for this work contains QGIS, Spatialite, Inkscape, Gimp and OpenOffice Calc. To see the complete submission and read more about it, check the project page.


Mapping Hubway Station Stats

Today, I’ve been working on some station statistics. From the trip data, I calculated incoming and outgoing trips per station as well as each station’s first day of operations. Combining this information makes it possible to calculate the average day’s “bike balance”. A balanced station has the same number of incoming and outgoing trips, while an unbalanced station will either run out of bikes or out of empty slots for returns.
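
In Python terms, the balance computation boils down to counting trips per station and dividing by the station’s days of operation. The following is only a sketch with made-up trip records, not the actual Hubway schema:

from collections import Counter
from datetime import date

# Made-up trip records: (start_station_id, end_station_id, trip_date)
trips = [(1, 2, date(2012, 8, 1)),
         (2, 1, date(2012, 8, 1)),
         (1, 3, date(2012, 8, 2))]

outgoing = Counter(start for start, end, day in trips)
incoming = Counter(end for start, end, day in trips)

# First day of operations per station, derived from the trip data
first_day = {}
for start, end, day in trips:
    for station in (start, end):
        first_day[station] = min(first_day.get(station, day), day)

last_day = max(day for start, end, day in trips)
for station in sorted(first_day):
    days = (last_day - first_day[station]).days + 1
    balance = float(incoming[station] - outgoing[station]) / days
    print(station, round(balance, 2))  # > 0: bikes pile up, < 0: the station drains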

I’ve published the resulting station map on QGIS Cloud (http://qgiscloud.com/anitagraser/hubway_cloud1) where you can have a look at the bike balance values.

Additionally, I’ve created a mashup in Leaflet pulling together background tiles from Stamen and the cloud-hosted WMS for better orientation:


Exploring Hubway’s Data II

Today, I’ve been experimenting with a new way to visualize origin-destination pairs (ODs). The following image shows my first results:

The idea was to add a notion of direction as well as uncertainty. The “flower petals” have a pointed origin and grow wider towards the middle. (Looking at the final result, they should probably get much narrower towards the end again.) The area covered by the petals is a simple approximation of where I’d expect the bike routes without performing any routing.

To get there, I reprojected the connection lines to EPSG:3857 and calculated connection length and line orientation using the QGIS Field Calculator’s $length operator and the bearing formula given in the QGIS Wiki:

(atan((xat(-1)-xat(0))/(yat(-1)-yat(0)))) * 180/3.14159 + (180 *(((yat(-1)-yat(0)) < 0) + (((xat(-1)-xat(0)) < 0 AND (yat(-1) - yat(0)) >0)*2)))
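
For reference, outside of the Field Calculator the same bearing (degrees clockwise from north) can be computed more compactly with atan2, which takes care of the quadrant corrections that the expression above spells out explicitly:

import math

def bearing(x0, y0, x1, y1):
    """Bearing from (x0, y0) to (x1, y1) in degrees, clockwise from north."""
    return math.degrees(math.atan2(x1 - x0, y1 - y0)) % 360

print(bearing(0, 0, 100, 0))  # a connection pointing due east: 90.0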

For the style, I created a new “flower petal” SVG symbol in Inkscape and styled it with varying transparency values: Rare connections are more transparent than popular ones. This style is applied to the connection start points. Using the advanced options “size scale” and “rotation”, it is possible to rotate the petals into the right direction as well as scale them using the previously calculated values for connection length and orientation.

Update

While the above example uses pretty wide petals, this one is done with a much narrower petal. I think it’s more appropriate for the data at hand:

Most of the connections are clearly heading south-east, across the Charles River, except for that group of connections pointing in the opposite direction, to Harvard Square.


Exploring Hubway’s Data I

Hubway is a bike sharing system in Boston and they are currently hosting a data visualization challenge. What a great chance to play with some real-world data!

To get started, I loaded both the station Shapefile and the trip CSV into a new Spatialite database. The GUI is really helpful here – everything is done in a few clicks. Afterwards, I decided to look into which station combinations are most popular. The following SQL script creates my connections table:

create table connections (
start_station_id INTEGER,
end_station_id INTEGER,
count INTEGER,
Geometry GEOMETRY);


insert into connections
select
  start_station_id,
  end_station_id,
  count(*) as count,
  LineFromText('LINESTRING('||X(a.Geometry)||' '||Y(a.Geometry)||','
                            ||X(b.Geometry)||' '||Y(b.Geometry)||')') as Geometry
from trips, stations a, stations b
where start_station_id = a.ID
  and end_station_id = b.ID
  and a.ID != b.ID
  and a.ID is not NULL
  and b.ID is not NULL
group by start_station_id, end_station_id;

(Note: This is for Spatialite 2.4, so there is no MakeLine() method. Use MakeLine if you are using 3.0.)
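
If you prefer scripting over the GUI, the resulting database can also be queried straight from Python. This is a rough sketch which assumes a loadable SpatiaLite extension module (here called mod_spatialite, the name used by newer SpatiaLite versions) and a database file named hubway.sqlite:

import sqlite3

conn = sqlite3.connect("hubway.sqlite")
conn.enable_load_extension(True)
conn.load_extension("mod_spatialite")  # module name depends on the SpatiaLite version

# The ten most popular connections
query = ('select start_station_id, end_station_id, "count" '
         'from connections order by "count" desc limit 10')
for start, end, count in conn.execute(query):
    print(start, end, count)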

For a first impression, I decided to map popular connections with more than one hundred entries. Wider lines mean more entries. The points show the station locations and they are color coded by starting letter. (I’m not yet sure if they mean anything. They seem to form groups.)

Some of the stations don’t seem to have any strong connections at all. Others are rather busy. The city center and the dark blue axis pointing west seem most popular.

I’m really looking forward to what everyone else will be finding in this dataset.


Exploring Mobility Data Using Time Manager

Data from various vehicles is collected for many purposes in cities worldwide. To get a feeling for just how much data is available, I created the following video using QGIS Time Manager; it has been shown at the Austrian Museum of Applied Arts exhibition “MADE 4 YOU – Design for Change”. It shows one hour of taxi tracks in the city of Vienna:

If you like the video, please go to http://www.ertico.com/2012-its-video-competition-open-vote and vote for it in the category “Videos directed at the general public”.


Space-Time Cubes – Exploring Twitter Streams III

This post continues my quest of exploring the spatial dimension of Twitter streams. I wanted to try one of the classic spatio-temporal visualization methods: the space-time cube, where the vertical axis represents time while the other two axes map space. Like the two previous examples, this visualization is written in pyprocessing, a Python port of the popular Processing environment.

This space-time cube shows Twitter trajectories that contain at least one tweet at New York’s Times Square. The 24-hour day starts at the bottom of the cube and continues to the top. Trajectories are colored based on the time stamp of their first tweet.

Additionally, all trajectories are also drawn in context of the coastline (data: OpenStreetMap) on the bottom of the cube.

While there doesn’t seem to be much going on in the early morning hours, we can see quite a lot of coming and going during the afternoon and evening. From the bunch of vertical lines over Times Square, we can also assume that some of our tweet authors spent considerable time at and near Times Square.

I’ve also created an animated version. Again, I recommend watching it in HD.


Pyprocessing for 3D Animations

I’ve been looking into the 3D capabilities of Pyprocessing for the creation of animated space-time cubes.

There are subtle differences between Processing and Pyprocessing. Processing is documented pretty well but I prefer Python over Java any time. So here is my port of the Processing “Cubes within Cube” example as a reference for how 3D animations are done in Pyprocessing.


(You can watch the animation live on the Processing site.)

from pyprocessing import *
from random import random

cubies = 20
c = [0]*cubies
quadBG = [[None]*6 for _ in range(cubies)]  # one independent color list per cubie (avoids list aliasing)

# Controls cubie's movement
x = [0.0]*cubies
y = [0.0]*cubies
z = [0.0]*cubies
xSpeed = [0.0]*cubies
ySpeed = [0.0]*cubies
zSpeed = [0.0]*cubies

# Controls cubie's rotation
xRot = [0.0]*cubies
yRot = [0.0]*cubies
zRot = [0.0]*cubies

stage = None

# Size of external cube
bounds = 300.0

def setup():
  size(640, 360)
  
  for i in range(0,cubies):
    # Each cube face gets a fixed greyscale color
    quadBG[i][0] = color(0)
    quadBG[i][1] = color(51)
    quadBG[i][2] = color(102)
    quadBG[i][3] = color(153)
    quadBG[i][4] = color(204)
    quadBG[i][5] = color(255)

    # Cubies are randomly sized
    cubieSize = random()*10+5
    c[i] =  Cube(cubieSize, cubieSize, cubieSize)

    # Initialize cubie's position, speed and rotation
    x[i] = 0.0
    y[i] = 0.0
    z[i] = 0.0

    xSpeed[i] = random()*4-2
    ySpeed[i] = random()*4-2
    zSpeed[i] = random()*4-2

    xRot[i] = random()*60+40
    yRot[i] = random()*60+40
    zRot[i] = random()*60+40
  

def draw():
  background(50)
  lights()
  
  # Center in display window
  translate(width/2, height/2, -130)
  
  # Outer transparent cube
  noFill()
  
  # Rotate everything, including external large cube
  rotateX(frame.count * 0.001)
  rotateY(frame.count * 0.002)
  rotateZ(frame.count * 0.001)
  stroke(255)
  
  # Draw external large cube
  stage = Cube(bounds, bounds, bounds)
  stage.create()

  # Move and rotate cubies
  for i in range(0,cubies):
    pushMatrix()
    translate(x[i], y[i], z[i])
    rotateX(frame.count*PI/xRot[i])
    rotateY(frame.count*PI/yRot[i])
    rotateZ(frame.count*PI/zRot[i])
    noStroke()
    c[i].create(quadBG[i])
    x[i] += xSpeed[i]
    y[i] += ySpeed[i]
    z[i] += zSpeed[i]
    popMatrix()

    # Draw lines connecting cubies
    stroke(0)
    if i < cubies-1:
      line(x[i], y[i], z[i], x[i+1], y[i+1], z[i+1])

    # Check wall collisions
    if x[i] > bounds/2 or x[i] < -bounds/2:
      xSpeed[i]*=-1
    
    if y[i] > bounds/2 or y[i] < -bounds/2:
      ySpeed[i]*=-1
    
    if z[i] > bounds/2 or z[i] < -bounds/2:
      zSpeed[i]*=-1
    


# Custom Cube Class

class Cube():
  def __init__(self,w,h,d):
    self.vertices = [0]*24
    self.w = w
    self.h = h
    self.d = d

    # cube composed of 6 quads
    #front
    self.vertices[0] =  PVector(-w/2,-h/2,d/2)
    self.vertices[1] =  PVector(w/2,-h/2,d/2)
    self.vertices[2] =  PVector(w/2,h/2,d/2)
    self.vertices[3] =  PVector(-w/2,h/2,d/2)
    #left
    self.vertices[4] =  PVector(-w/2,-h/2,d/2)
    self.vertices[5] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[6] =  PVector(-w/2,h/2,-d/2)
    self.vertices[7] =  PVector(-w/2,h/2,d/2)
    #right
    self.vertices[8] =  PVector(w/2,-h/2,d/2)
    self.vertices[9] =  PVector(w/2,-h/2,-d/2)
    self.vertices[10] =  PVector(w/2,h/2,-d/2)
    self.vertices[11] =  PVector(w/2,h/2,d/2)
    #back
    self.vertices[12] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[13] =  PVector(w/2,-h/2,-d/2)
    self.vertices[14] =  PVector(w/2,h/2,-d/2)
    self.vertices[15] =  PVector(-w/2,h/2,-d/2)
    #top
    self.vertices[16] =  PVector(-w/2,-h/2,d/2)
    self.vertices[17] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[18] =  PVector(w/2,-h/2,-d/2)
    self.vertices[19] =  PVector(w/2,-h/2,d/2)
    #bottom
    self.vertices[20] =  PVector(-w/2,h/2,d/2)
    self.vertices[21] =  PVector(-w/2,h/2,-d/2)
    self.vertices[22] =  PVector(w/2,h/2,-d/2)
    self.vertices[23] =  PVector(w/2,h/2,d/2)
  
  def create(self,quadBG=None):
    # Draw cube
    for i in range(0,6):
      if quadBG:
          fill(quadBG[i])
      beginShape(QUADS)
      for j in range(0,4):
        vertex(self.vertices[j+4*i].x, self.vertices[j+4*i].y, self.vertices[j+4*i].z)
      endShape()

run()


A Visual Exploration of Twitter Streams II

After my first shot at analyzing Twitter data visually I received a lot of great feedback. Thank you!

For my new attempt, I worked on incorporating your feedback, such as filtering unrealistic location changes, showing connections “grow” instead of just popping up, and zooming in on an interesting location. The new animation therefore focuses on Manhattan – one of the places with reasonably high geotweet coverage.

The background is based on OpenStreetMap coastline data which I downloaded using the QGIS OSM plugin and rendered in pyprocessing together with the geotweets. To really see what’s going on, switch to HD resolution and full screen:

It’s pretty much work in progress. The animation shows chaotic patterns similar to those seen in others’ attempts at animating tweets. To me, the distribution of tweets looks reasonable, and many of the connection lines seem to actually coincide with the bridges leading to and from Manhattan.

This work is an attempt at discovering the potential of Twitter data and at the same time learning some pyprocessing which will certainly be useful for many future tasks. The next logical step seems to be to add information about interactions between users and/or to look at the message content. Another interesting task would be to add interactivity to the visualization.


A Visual Exploration of Twitter Streams

Twitter streams are curious things, especially the spatial data part. I’ve been using Tweepy to collect tweets from the public timeline and what did I discover? Tweets can have up to three different spatial references: “coordinates”, “geo” and “place”. I’ll still have to do some more reading on how to interpret these different attributes.

For now, I have been using “coordinates” to explore the contents of a stream which was collected over a period of five hours using

stream.filter(follow=None, locations=(-180, -90, 180, 90))

for global coverage. In the video, each georeferenced tweet produces a new dot on the map and if the user’s coordinates change, a blue arrow is drawn:

While pretty, these long blue arrows seem rather suspicious. I’ve only been monitoring the stream for around five hours. Any cross-Atlantic flight would take longer than that. I’m either misinterpreting the tweets or these coordinates are fake. Seems like it is time to dive deeper into the data.
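
For reference, collecting such a stream takes only a handful of lines with Tweepy’s streaming API. This sketch assumes the pre-4.0 Tweepy StreamListener interface and uses placeholder credentials:

import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

class CoordinateListener(tweepy.StreamListener):
    def on_status(self, status):
        # "coordinates" is a GeoJSON point: {"type": "Point", "coordinates": [lon, lat]}
        if status.coordinates:
            lon, lat = status.coordinates["coordinates"]
            print(status.user.screen_name, lon, lat)

stream = tweepy.Stream(auth=auth, listener=CoordinateListener())
stream.filter(follow=None, locations=[-180, -90, 180, 90])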


Glowing Hot Maps – QGIS Meets Gimp

The waiting is over: Gimp 2.8 is finally here. That is reason enough to take it for a quick test run!

How about a new look for the QGIS user map?

This “glowing hot” map was made using the Gimp filter of the same name:

For the user point layer, I selected a simple point style with high transparency and separately exported the land and user point layers from the print composer.

user points as exported from QGIS

In Gimp, I applied the “glowing hot” filter to the user points and combined the layers. The trick here is to first use “Color to alpha” on the user point layer and turn black to transparent. This way, the “glowing hot” filter will only be applied to the remaining points.

Gimp 2.8 RC1 is close enough to the previous version to get comfortable fast. I like the single-window mode even if it’s hard to tell which part of the GUI has the focus sometimes.

Open source GIS and image editing make for a perfect workflow.


Mapping the Night

Most maps of night time lights show the land masses lit brightly by city lights. But the oceans are not as dark as these maps suggest. NOAA/NGDC datasets available through edenextdata.com show very bright spots in the North Sea:

Night time lights trace the coast but illuminate the sea too.

The dataset description mentions that the sensors pick up moonlit clouds, lights from human settlements, fires, gas flares, heavily lit fishing boats, lightning and the aurora. So might these spots be fishing boats?

