QGIS Planet

Towards better gradients

There are interesting developments going on if you like creating your own gradients. After all, that’s not as easy as it might initially seem, as Gregor Aisch describes in his post “Mastering Multi-hued Color Scales with Chroma.js”:

The issues with simple color interpolations, which include nonuniform changes in lightness between classes, also haunt us in cartography. Just have a look at the map and legend on the left-hand side, which were created using a standard custom QGIS gradient with colors ranging from black through red and yellow to white. We end up with three classes in yellow which are nearly impossible to tell apart:

comparing_ramps

For comparison, on the right side, I’ve used Gregor’s corrected color ramp, which ensures that lightness changes evenly from one class to the next.
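
To see the problem in numbers rather than colors, here is a small self-contained Python sketch (my own illustration, not part of QGIS or Chroma.js) that samples a naive black-red-yellow-white ramp at ten evenly spaced positions and prints the perceptual lightness (CIE L*) of each sample. The L* values barely change across the yellow classes, which is exactly why they are so hard to tell apart:

def cie_lightness(r, g, b):
    """CIE L* (0 = black, 100 = white) of an sRGB color with 0-255 components."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    # relative luminance (D65 white point)
    y = 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

def lerp(c1, c2, t):
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

anchors = [(0, 0, 0), (255, 0, 0), (255, 255, 0), (255, 255, 255)]  # black-red-yellow-white
for i in range(10):                              # ten evenly spaced class breaks
    t = i / 9 * (len(anchors) - 1)
    seg = min(int(t), len(anchors) - 2)
    rgb = lerp(anchors[seg], anchors[seg + 1], t - seg)
    print(rgb, round(cie_lightness(*rgb), 1))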

Wouldn’t it be great if the built-in gradient tool in QGIS could correct for lightness? Too bad the current dialog is not that great:

My first reaction therefore was to write a short script to import gradients from Gregor’s Chroma.js Color Scale Helper into QGIS:
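
In essence, such an import boils down to turning a list of hex colors into a QGIS color ramp. Here is a rough PyQGIS sketch of the idea (not the original script; it uses current QGIS 3 class names, which differ slightly from the QGIS 2 API of the time, and the hex values are just placeholders pasted from the helper):

# rough sketch: turn a list of hex colors into a new QGIS color ramp
from qgis.PyQt.QtGui import QColor
from qgis.core import QgsGradientColorRamp, QgsGradientStop, QgsStyle

# colors as copied from the Chroma.js Color Scale Helper (placeholder values)
hex_colors = ['#000000', '#6d010e', '#c9571c', '#eda04f', '#ffffe0']

stops = [QgsGradientStop(i / (len(hex_colors) - 1), QColor(c))
         for i, c in enumerate(hex_colors[1:-1], start=1)]
ramp = QgsGradientColorRamp(QColor(hex_colors[0]), QColor(hex_colors[-1]), False, stops)
# make the ramp available in the style manager for this session
QgsStyle.defaultStyle().addColorRamp('chroma_corrected', ramp)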

But we’ll probably have a much better solution in QGIS soon since Nyall Dawson has picked up the idea and is already working on a completely new version of the gradient tool. You can see a demo of the current work in progress here:

I’m really looking forward to trying this out once it hits master!


What went on at FOSS4G 2015?

Granted, I could only follow FOSS4G 2015 remotely on social media but what I saw was quite impressive and will keep me busy exploring for quite a while. Here’s my personal pick of this year’s highlights which I’d like to share with you:

QGIS

Marco Hugentobler at FOSS4G 2015 (Photo by Jody Garnett)

The Sourcepole team has been particularly busy with four presentations which you can find on their blog.

Marco Hugentobler’s keynote is just great, summing up the history of the QGIS project and discussing success factors for open source projects.

Marco also gave a second presentation on new QGIS features for power users, including live layer effects, new geometry support (curves!), and geometry checker.

There has also been an update to the QTiles plugin by NextGIS this week.

If you’re a bit more into webmapping, Victor Olaya presented the Web App Builder he’s been developing at Boundless. Web App Builder should appear in the official plugin repo soon.

Preview of Web App Builder from Victor’s presentation


Geocoding

If you work with messy, real-world data, you’ve most certainly been fighting with geocoding services, trying to make the best of a bunch of address lists. The Python Geocoder library promises to make dealing with geocoding services such as Google, Bing, OSM and many more easier than ever before.
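
I haven’t tried it extensively yet, but basic usage looks roughly like this (a quick sketch based on the library’s documented interface; the address below is just an example):

import geocoder  # pip install geocoder

g = geocoder.osm('Stephansplatz 1, Vienna, Austria')  # Nominatim/OSM backend
if g.ok:
    print(g.latlng)   # [latitude, longitude]
    print(g.address)  # normalized address string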

Let me know if you tried it.

Mobmap Visualizations

Mobmap – or more specifically Mobmap2 – is an extension for Chrome which offers visualization and analysis capabilities for trajectory data. I haven’t tried it yet but their presentation certainly looks very interesting:


QGIS on the rise with journalists

If you are following QGIS on Twitter you’ve probably noticed the increasing number of tweets by journalists using QGIS.

For example, this map in the Financial Times by Hannah Dormido,

or this one with overview maps and three different levels of detail,

or this map with semi-transparent label backgrounds and nice flag images,

or even Time Manager animations by raoulranoa in the Los Angeles Times.

I think this is a great development and a sign of how widespread QGIS usage is today.

If you know of any other examples or if you are a journalist using QGIS yourself, I’d love to see more!


Video tutorial: animated heatmaps with QGIS

Do you like the QGIS heatmap functionality? Did you know that QGIS can also create animated heatmaps?

The following video tutorial shows all necessary steps. To reproduce it, you can get the sample data from my Time Manager workshop at #QGIS2015.


Time Manager workshop at #QGIS2015

Today was the final day of #QGIS2015, the first joint QGIS conference and developer meeting. I had the pleasure to meet Time Manager co-developer Karolina Alexiou aka carolinux in person and to give a talk, including a hands-on workshop, on Time Manager together. Time Manager makes it possible to explore spatio-temporal data by creating animations directly in QGIS.

The talk presents QGIS visualization tools with a focus on efficient use of layer styling to both explore and present spatial data. Examples include the recently added heatmap style as well as sophisticated rule-based and data-defined styles. The main part of the presentation is about exploring and presenting spatio-temporal data using the Time Manager plugin. A special treat is time-dependent styling using expressions that access the current Time Manager timestamp.

To get the example data and QGIS projects, download Time_Manager_Examples.zip.


Trajectory animations with fadeout effect

Today’s post is a short tutorial for creating trajectory animations with a fadeout effect using QGIS Time Manager. This is the result we are aiming for:

The animation shows the current movement in pink which fades out and leaves behind green traces of the trajectories.

About the data

GeoLife GPS Trajectories were collected within the (Microsoft Research Asia) GeoLife project by 182 users over a period of more than five years (April 2007 to August 2012). [1,2,3] The GeoLife GPS Trajectories download contains many text files organized in multiple directories. The data files are basically CSVs with 6 lines of header information. They contain the following fields (a short parsing sketch follows the list):

Field 1: Latitude in decimal degrees.
Field 2: Longitude in decimal degrees.
Field 3: All set to 0 for this dataset.
Field 4: Altitude in feet (-777 if not valid).
Field 5: Date – number of days (with fractional part) that have passed since 12/30/1899.
Field 6: Date as a string.
Field 7: Time as a string.
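
For a quick look at a single trajectory file before loading anything into a database, the format can be read with pandas (a minimal sketch; adjust the placeholder path to one of the .plt files in the download):

import pandas as pd

cols = ['lat', 'lon', 'zero', 'alt_ft', 'days', 'date', 'time']
df = pd.read_csv('Data/000/Trajectory/some_trajectory.plt',   # placeholder path
                 skiprows=6, header=None, names=cols)
df['t_datetime'] = pd.to_datetime(df['date'] + ' ' + df['time'])
print(df[['lat', 'lon', 't_datetime']].head())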

Data prep: PostGIS

Since any kind of GIS operation on text files will be quite inefficient, I decided to load the data into a PostGIS database. This table of millions of GPS points can then be sliced into appropriate chunks for exploration, for example, a day in Beijing:

CREATE MATERIALIZED VIEW geolife.beijing 
AS SELECT trajectories.id,
    trajectories.t_datetime,
    trajectories.t_datetime + interval '1 day' as t_to_datetime,
    trajectories.geom,
    trajectories.oid
   FROM geolife.trajectories
   WHERE st_dwithin(trajectories.geom,
           st_setsrid(
             st_makepoint(116.3974589, 
                           39.9388838), 
             4326), 
           0.1) 
   AND trajectories.t_datetime >= '2008-11-11 00:00:00'
   AND trajectories.t_datetime < '2008-11-12 00:00:00'
WITH DATA;

Trajectory viz: a fadeout effect for point markers

The idea behind this visualization is to show both the current movement as well as the history of the trajectories. This can be achieved with a fadeout effect which leaves behind traces of past movement while the most recent positions are highlighted to stand out.

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL.

This effect can be created using a Single Symbol renderer with a marker symbol that consists of two symbol layers: one layer serves as the highlights layer (pink), while the second layer represents the traces (green) which linger after the highlights disappear. Feature blending is used to achieve the desired effect for overlapping markers.

Screenshot 2015-05-06 23.52.40

The highlights layer has two expression-based properties: color and size. The color fades to white and the point size shrinks as the point ages. The age can be computed by comparing the point’s t_datetime timestamp to the Time Manager animation time $animation_datetime.

This expression creates the color fading effect:

color_hsv(  
  311,
  scale_exp( 
    minute(age($animation_datetime,"t_datetime")),
    0,60,
    100,0,
    0.2
  ),
  90
)

and this expression makes the point size shrink:

scale_exp( 
  minute(age($animation_datetime,"t_datetime")),
  0,60,
  24,0,
  0.2
)

Outlook

I’m currently preparing this and a couple of other examples for my Time Manager workshop at the upcoming 1st QGIS conference in Nødebo. The workshop materials will be made available online afterwards.

Literature

[1] Yu Zheng, Lizhu Zhang, Xing Xie, Wei-Ying Ma. Mining interesting locations and travel sequences from GPS trajectories. In Proceedings of the International Conference on World Wide Web (WWW 2009), Madrid, Spain. ACM Press: 791-800.
[2] Yu Zheng, Quannan Li, Yukun Chen, Xing Xie, Wei-Ying Ma. Understanding Mobility Based on GPS Data. In Proceedings of the ACM Conference on Ubiquitous Computing (UbiComp 2008), Seoul, Korea. ACM Press: 312-321.
[3] Yu Zheng, Xing Xie, Wei-Ying Ma. GeoLife: A Collaborative Social Networking Service among User, Location and Trajectory. Invited paper, IEEE Data Engineering Bulletin, 33(2), 2010, pp. 32-40.


New stable release of GRASS GIS 7.0.0!

The GRASS GIS Development team has announced the release of the new major version GRASS GIS 7.0.0. This version provides many new functionalities, including spatio-temporal database support, image segmentation, estimation of evapotranspiration and emissivity from satellite imagery, automatic line vertex densification during reprojection, more LiDAR support and a greatly improved graphical user interface experience. GRASS GIS 7.0.0 also offers significantly improved performance for many raster and vector modules: “Many processes that would take hours now take less than a minute, even on my small laptop!” explains Markus Neteler, the coordinator of the development team composed of academics and GIS professionals from around the world. The software is available for Linux, MS Windows, Mac OS X and other operating systems.

Detailed announcement and software download:
http://grass.osgeo.org/news/42/15/GRASS-GIS-7-0-0/

About GRASS GIS
The Geographic Resources Analysis Support System (http://grass.osgeo.org/), commonly referred to as GRASS GIS, is an open source Geographic Information System providing powerful raster, vector and geospatial processing capabilities in a single integrated software suite. GRASS GIS includes tools for spatial modeling, visualization of raster and vector data, management and analysis of geospatial data, and the processing of satellite and aerial imagery. It also provides the capability to produce sophisticated presentation graphics and hardcopy maps. GRASS GIS has been translated into about twenty languages and supports a huge array of data formats. It can be used either as a stand-alone application or as backend for other software packages such as QGIS and R geostatistics. It is distributed freely under the terms of the GNU General Public License (GPL). GRASS GIS is a founding member of the Open Source Geospatial Foundation (OSGeo).


Experiments with Conway’s Game of Life

This experiment is motivated by a discussion I had with Dr. Claus Rinner about introducing students to GIS concepts using Conway’s Game of Life, a popular example for demonstrating cellular automata. Based on an input grid of “alive” and “dead” cells, new cell values are computed on each iteration based on four simple rules for the cell and its 8 neighbors (condensed into a short code sketch after the list):

  1. Any live cell with fewer than two live neighbours dies, as if caused by under-population.
  2. Any live cell with two or three live neighbours lives on to the next generation.
  3. Any live cell with more than three live neighbours dies, as if by overcrowding.
  4. Any dead cell with exactly three live neighbours becomes a live cell, as if by reproduction.

(Source: Wikipedia – Conway’s Game of Life)
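
Condensed into code, the four rules amount to just a few lines. Here is a minimal, GIS-agnostic Python sketch of one iteration over a set of live cell coordinates (an illustration only, not the Processing script discussed below):

from collections import Counter

def next_generation(alive):
    """One Game of Life step; 'alive' is a set of (col, row) tuples."""
    # each live cell contributes +1 to the neighbour count of its 8 neighbours
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # a cell lives in the next generation if it has 3 neighbours,
    # or 2 neighbours and was already alive (rules 1-4 condensed)
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# a "blinker" oscillates between a horizontal and a vertical bar
print(next_generation({(0, 1), (1, 1), (2, 1)}))  # {(1, 0), (1, 1), (1, 2)}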

Based on these simple rules, effects like the following “glider gun” can be achieved:

Gospers glider gun.gif
“Gospers glider gun” by Kieff (own work). Licensed under CC BY-SA 3.0 via Wikimedia Commons.

There are some Game of Life implementations for GIS out there, e.g. scripts for ArcGIS or a module for SAGA. Both of these examples are raster-based. Since I couldn’t find any examples of raster manipulation like this in PyQGIS, I decided to implement a vector version instead: a Processing script which receives an input grid of cells and outputs the next iteration based on the rules of Game of Life. In the following screencast, you can see the Processing script being called repeatedly by a script from the Python console:

So far, it’s a quick and dirty first implementation. To make it smoother, I’m considering adding spatial indexing and using memory layers instead of having Processing create a bunch of Shapefiles.

It would also be interesting to see a raster version done in PyQGIS. Please leave a comment if you have any ideas on how this could be achieved.


Rendering a brain CT scan in 3D with GRASS GIS 7

brainscan1

Last year (2013) I “enjoyed” a brain CT scan in order to identify a post-surgery issue – luckily nothing was found. Being in Italy, like all patients I received a CD-ROM with the scan data on it: so, something to play with! In this article I’ll show how to easily turn the 2D scan data into a volumetric (voxel) visualization.

The CT scan data come in DICOM format, which ImageMagick is able to read and convert. In addition, we need the open source software packages GRASS GIS 7 and Paraview to get the job done.

First of all, we create a new XY (unprojected) GRASS location to import the data into:

# create a new, empty location (or use the Location wizard):
grass70 -c ~/grassdata/brain_ct

We now start GRASS GIS 7 with that location. After mounting the CD-ROM we navigate into the image directory therein. The directory name depends on the type of CT scanner which was used in the hospital. The file name suffix may be .IMA.

Now we count the number of images, convert and import them into GRASS GIS:

# list and count
LIST=`ls -1 *.IMA`
MAX=`echo $LIST | wc -w`

# import into XY location:
curr=1
for i in $LIST ; do

# pretty print the numbers to 000X for easier looping:
curr=`echo $curr | awk '{printf "%04d\n", $1}'`
convert "$i" brain.$curr.png
r.in.gdal in=brain.$curr.png out=brain.$curr
r.null brain.$curr setnull=0
rm -f brain.$curr.png
curr=`expr $curr + 1`

done

At this point all CT slices are imported in an ordered way. For extra fun, we can animate the 2D slices in g.gui.animation:

Animation of brain scan slices
(click to enlarge)

# enter in one line:
g.gui.animation rast=`g.mlist -e rast separator=comma pattern="brain*"`

The tool allows exporting the animation as an animated GIF or AVI.


Now it is time to generate a volume:

# first count number of available layers
g.mlist rast pat="brain*" | wc -l

# now set 3D region to number of available layers (as number of depths)
g.region rast=brain.0003 b=1 t=$MAX -p3

At this point the computational region is properly defined to our 3D raster space. Time to convert the 2D slices into voxels by stacking them on top of each other:

# stack the 2D slices into a 3D raster (voxel) map:
r.to.rast3 `g.mlist rast pat="brain*" sep=,` out=brain_vol

We can now look at the volume with GRASS GIS’ wxNVIZ or preferably the extremely powerful Paraview. The latter requires an export of the volume to VTK format:

# fetch some environment variables (sets GISDBASE, LOCATION_NAME, MAPSET)
eval `g.gisenv -s`
# export GRASS voxels to VTK as 3D points, with scaled z values
# (note: PREFIX is not set by g.gisenv; define it first, e.g. PREFIX=$LOCATION_NAME)
SCALE=2
g.message "Exporting to VTK format, scale factor: $SCALE"
r3.out.vtk brain_vol dp=2 elevscale=$SCALE \
output=${PREFIX}_${MAPSET}_brain_vol_scaled${SCALE}.vtk -p

Eventually we can open this new VTK file in Paraview for visual exploration:

# show as volume
# In Paraview: Properties: Apply; Display Repres: volume; etc.
paraview --data=brain_s1_vol_scaled2.vtk

markus_brain_ct_scan3 markus_brain_ct_scan4 markus_brain_ct_scan2

Fairly easy!

BTW: I have a scan of my non-smoker lungs as well :-)


Visualizing direction-dependent values

When mapping flows or other values which relate to a certain direction, styling these layers gets interesting. I faced this challenge when mapping direction-dependent error values: neighboring cell pairs were connected by two lines, one for each direction, each with an associated error value. This is what I came up with:

srtm_errors_1200px

Each line is drawn with an offset to the right. The size of the offset depends on the width of the line which in turn depends on the size of the error. You can see the data-defined style properties here:

directed_error_style

To indicate the direction, I added a marker line with one > marker at the center. This marker line was assigned the same offset to match the colored line below. I’m quite happy with how these turned out and would love to hear about your approaches to this issue.

srtm_errors_detail

These figures are part of a recent publication with my AIT colleagues: A. Graser, J. Asamer, M. Dragaschnig: “How to Reduce Range Anxiety? The Impact of Digital Elevation Model Quality on Energy Estimates for Electric Vehicles” (2014).


3D viz with QGIS & three.js

If you are looking for a tool to easily create 3D visualizations of your geodata, look no further! Qgis2threejs is a plugin by Minoru Akagi which exports terrain data, combined with the map canvas image and optional vector data, to an HTML file that can be viewed in 3D in any web browser with WebGL support. To do that, the plugin uses the Three.js library.

This is the result of my first experiments with Qgis2threejs. In the following sections, I will show the steps to reproduce it.

Türkenschanzpark, Vienna

click for the interactive version (requires WebGL-capable browser)

1. The data

The building blocks of this visualization are:

  • elevation data and the hillshade derived from this data
  • a base map (WMTS from basemap.at in my case)
  • OSM building data provided by Geofabrik and
  • tree data from the city of Vienna

Load all datasets into QGIS.

2. Preparing the map

Qgis2threejs will overlay the map (as rendered in the QGIS map area) on top of the elevation model. You can combine any number of layers to create your map. I just loaded a basemap.at WMTS and a hillshade layer. To add a nice tree shadow effect, I also added the tree layer (dark grey, 50% transparency, multiply blending).

tuerkenschanzpark_map

3. Preparing the vector features

The vector features in the visualization are buildings and trees. The buildings are based on an OSM building layer. The trees are created from two point layers: one point layer for the tree trunks (cylinder shape) and a duplicate of this point layer for the tree crowns (sphere shape).

Load the data and choose the desired fill colors.

4. Using Qgis2threejs

Now we can start Qgis2threejs. The first tab is used to configure the terrain. Just pick the correct elevation data layer. I didn’t modify any of the other default settings.

qgis2threejs_dem

The second tab provides the settings for the vector data. As mentioned in the previous section, the trees are created from two point layers and the buildings are based on a polygon layer. The tree crowns are spheres with a radius of 3 and a z value of 5 above the surface. The tree trunks are cylinders. Finally, the buildings have a height of 10.

qgis2threejs_vector

That’s it! Just press “run” and wait. When the export is finished, your default browser (or a different one, if you specify another one in the plugin settings) will open automatically and display the results.
The visualization is interactive. You can tilt the visualization using the left mouse button, pan using the right mouse button, and zoom using the mouse wheel. I found that Firefox used around 1.6 GB of RAM to render this example.

5. Share your visualization

In the browser window, you will see where Qgis2threejs stored the HTML and associated JavaScript files. To share your visualization, you just need to copy these files onto a web server.

I would love to see what you come up with. Please share a link in the comments.


Data-defined properties in QGIS 2.0

In QGIS 2.0, the old “size scale” field has been replaced by data-defined properties which enable us to control many more properties than just size and rotation. One of the often requested features, for example, is data-defined colors:

datadefinedproperties

Today’s example map visualizes a dataset of known meteorite landings published on http://visualizing.org/datasets/meteorite-landings. I didn’t clean the data, so there is quite a bunch of meteorites at 0/0.

To create the map, I used QGIS 2.0 feature blending mode “multiply” as well as data-defined size based on meteorite mass:

meteorites1

Background oceans and graticule by NaturalEarthData.


Dataviz with OpenSource Tools

Today, I’ve finished my submission for the Hubway Data Visualization Challenge. All parts of the resulting dataviz were created using open source tools. My toolbox for this work contains: QGIS, Spatialite, Inkscape, Gimp and Open Office Calc. To see the complete submission and read more about it, check the project page.


Mapping Hubway Station Stats

Today, I’ve been working on some station statistics. From the trip data, I calculated incoming and outgoing trips per station as well as each station’s first day of operations. Combining this information makes it possible to calculate the average day’s “bike balance”. A balanced station has the same number of incoming and outgoing trips, while an unbalanced station will run out of either bikes or empty slots for returns.
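
The calculation itself boils down to a few aggregations. Here is a rough pandas sketch of the idea (file and column names are placeholders; the real Hubway CSV headers differ):

import pandas as pd

# placeholder file and column names - adjust to the actual Hubway trip CSV
trips = pd.read_csv('hubway_trips.csv', parse_dates=['start_date'])

outgoing = trips.groupby('start_station_id').size()
incoming = trips.groupby('end_station_id').size()
first_day = trips.groupby('start_station_id')['start_date'].min().dt.normalize()
days_active = (trips['start_date'].max().normalize() - first_day).dt.days + 1

# average daily bike balance: positive means more bikes arrive than leave
balance = incoming.sub(outgoing, fill_value=0) / days_active
print(balance.sort_values().head())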

I’ve published the resulting station map on QGIS Cloud (http://qgiscloud.com/anitagraser/hubway_cloud1) where you can have a look at the bike balance values.

Additionally, I’ve created a mashup in Leaflet pulling together background tiles from Stamen and the cloud-hosted WMS for better orientation:


Exploring Hubway’s Data II

Today, I’ve been experimenting with a new way to visualize origin-destination pairs (ODs). The following image shows my first results:

The idea was to add a notion of direction as well as uncertainty. The “flower petals” have a pointed origin and grow wider towards the middle. (Looking at the final result, they should probably get much narrower towards the end again.) The area covered by the petals is a simple approximation of where I’d expect the bike routes, without performing any routing.

To get there, I reprojected the connection lines to EPSG:3857 and calculated connection length and line orientation using the QGIS Field Calculator’s $length operator and the bearing formula given in the QGIS Wiki:

(atan((xat(-1)-xat(0))/(yat(-1)-yat(0)))) * 180/3.14159 + (180 *(((yat(-1)-yat(0)) < 0) + (((xat(-1)-xat(0)) < 0 AND (yat(-1) - yat(0)) >0)*2)))
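
In Python, the same bearing (in degrees clockwise from north, assuming projected coordinates) can be computed more compactly with atan2, which takes care of the quadrant corrections that make the expression above so long; the results agree modulo 360:

import math

def bearing(x_start, y_start, x_end, y_end):
    """Bearing in degrees clockwise from north (0-360), for projected coordinates."""
    return math.degrees(math.atan2(x_end - x_start, y_end - y_start)) % 360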

For the style, I created a new "flower petal" SVG symbol in Inkscape and styled it with varying transparency values: rare connections are more transparent than popular ones. This style is applied to the connection start points. Using the advanced options "size scale" and "rotation", it is possible to rotate the petals in the right direction as well as scale them using the previously calculated connection length and orientation values.

Update

While the above example uses pretty wide petals, this one is done with a much narrower petal. I think it’s more appropriate for the data at hand:

Most of the connections are clearly heading southeast, across the Charles River, except for the group of connections pointing in the opposite direction, towards Harvard Square.


Exploring Hubway’s Data I

Hubway is a bike sharing system in Boston and they are currently hosting a data visualization challenge. What a great chance to play with some real-world data!

To get started, I loaded both the station Shapefile and the trip CSV into a new Spatialite database. The GUI is really helpful here – everything is done in a few clicks. Afterwards, I decided to look into which station combinations are most popular. The following SQL script creates my connections table:

create table connections (
start_station_id INTEGER,
end_station_id INTEGER,
count INTEGER,
Geometry GEOMETRY);


insert into connections select 
start_station_id, 
end_station_id, 
count(*) as count, 
LineFromText('LINESTRING('||X(a.Geometry)||' '||Y(a.Geometry)||','
                          ||X(b.Geometry)||' '||Y(b.Geometry)||')') as Geometry
 from trips, stations a, stations b
where start_station_id = a.ID 
and end_station_id = b.ID
and a.ID != b.ID
and a.ID is not NULL
and b.ID is not NULL
group by start_station_id, end_station_id;

(Note: This is for Spatialite 2.4, so there is no MakeLine() method. Use MakeLine if you are using 3.0.)

For a first impression, I decided to map popular connections with more than one hundred entries. Wider lines mean more entries. The points show the station locations and they are color coded by starting letter. (I’m not yet sure if they mean anything. They seem to form groups.)

Some of the stations don’t seem to have any strong connections at all. Others are rather busy. The city center and the dark blue axis pointing west seem most popular.

I’m really looking forward to what everyone else will be finding in this dataset.


Exploring Mobility Data Using Time Manager

Data from various vehicles is collected for many purposes in cities worldwide. To get a feeling for just how much data is available, I created the following video using QGIS Time Manager. It has been shown at the Austrian Museum of Applied Arts exhibition "MADE 4 YOU – Design for Change" and shows one hour of taxi tracks in the city of Vienna:

If you like the video, please go to http://www.ertico.com/2012-its-video-competition-open-vote and vote for it in the category “Videos directed at the general public”.


Space-Time Cubes – Exploring Twitter Streams III

This post continues my quest of exploring the spatial dimension of Twitter streams. I wanted to try one of the classic spatio-temporal visualization methods: the space-time cube, where the vertical axis represents time while the other two map space. Like the two previous examples, this visualization is written in pyprocessing, a Python port of the popular Processing environment.

This space-time cube shows Twitter trajectories that contain at least one tweet in Times Square, New York. The 24-hour day starts at the bottom of the cube and continues to the top. Trajectories are colored based on the time stamp of their start tweet.

Additionally, all trajectories are also drawn in context of the coastline (data: OpenStreetMap) on the bottom of the cube.
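
The mapping from a geotweet to cube coordinates is simple; roughly like this (a simplified sketch, not the actual pyprocessing code):

def to_cube_coords(lon, lat, seconds_of_day, extent, cube_size=500.0):
    """Map (lon, lat, time of day) to x/y/z positions inside a space-time cube."""
    lon_min, lat_min, lon_max, lat_max = extent
    x = (lon - lon_min) / (lon_max - lon_min) * cube_size
    y = (lat - lat_min) / (lat_max - lat_min) * cube_size
    z = seconds_of_day / 86400.0 * cube_size   # the 24-hour day spans bottom to top
    return x, y, z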

While there doesn’t seem to be much going on in the early morning hours, we can see quite a lot of coming and going during the afternoon and evening. The bunch of vertical lines over Times Square also suggests that some of our tweet authors spent considerable time at and near Times Square.

I’ve also created an animated version. Again, I recommend watching it in HD.


Pyprocessing for 3D Animations

I’ve been looking into the 3D capabilities of Pyprocessing for the creation of animated space-time cubes.

There are subtle differences between Processing and Pyprocessing. Processing is documented pretty well but I prefer Python over Java any time. So here is my port of the Processing “Cubes within Cube” example as a reference for how 3D animations are done in Pyprocessing.


(You can watch the animation live on the Processing site.)

from pyprocessing import *
from random import random

cubies = 20
c = [0]*cubies
quadBG = [[None]*6 for _ in range(cubies)]  # independent color list per cubie (avoids list aliasing)

# Controls cubie's movement
x = [0.0]*cubies
y = [0.0]*cubies
z = [0.0]*cubies
xSpeed = [0.0]*cubies
ySpeed = [0.0]*cubies
zSpeed = [0.0]*cubies

# Controls cubie's rotation
xRot = [0.0]*cubies
yRot = [0.0]*cubies
zRot = [0.0]*cubies

stage = None

# Size of external cube
bounds = 300.0

def setup():
  size(640, 360);
  
  for i in range(0,cubies):
    # Each cube face has a random color component
    quadBG[i][0] = color(0)
    quadBG[i][1] = color(51)
    quadBG[i][2] = color(102)
    quadBG[i][3] = color(153)
    quadBG[i][4] = color(204)
    quadBG[i][5] = color(255)

    # Cubies are randomly sized
    cubieSize = random()*10+5
    c[i] =  Cube(cubieSize, cubieSize, cubieSize)

    # Initialize cubie's position, speed and rotation
    x[i] = 0.0
    y[i] = 0.0
    z[i] = 0.0

    xSpeed[i] = random()*4-2
    ySpeed[i] = random()*4-2
    zSpeed[i] = random()*4-2

    xRot[i] = random()*60+40
    yRot[i] = random()*60+40
    zRot[i] = random()*60+40
  

def draw():
  background(50)
  lights()
  
  # Center in display window
  translate(width/2, height/2, -130)
  
  # Outer transparent cube
  noFill()
  
  # Rotate everything, including external large cube
  rotateX(frame.count * 0.001)
  rotateY(frame.count * 0.002)
  rotateZ(frame.count * 0.001)
  stroke(255)
  
  # Draw external large cube
  stage =  Cube(bounds, bounds, bounds);
  stage.create()

  # Move and rotate cubies
  for i in range(0,cubies):
    pushMatrix()
    translate(x[i], y[i], z[i])
    rotateX(frame.count*PI/xRot[i])
    rotateY(frame.count*PI/yRot[i])
    rotateZ(frame.count*PI/zRot[i])
    noStroke()
    c[i].create(quadBG[i])
    x[i] += xSpeed[i]
    y[i] += ySpeed[i]
    z[i] += zSpeed[i]
    popMatrix()

    # Draw lines connecting cubies
    stroke(0)
    if i < cubies-1:
      line(x[i], y[i], z[i], x[i+1], y[i+1], z[i+1])

    # Check wall collisions
    if x[i] > bounds/2 or x[i] < -bounds/2:
      xSpeed[i]*=-1
    
    if y[i] > bounds/2 or y[i] < -bounds/2:
      ySpeed[i]*=-1
    
    if z[i] > bounds/2 or z[i] < -bounds/2:
      zSpeed[i]*=-1
    


# Custom Cube Class

class Cube():
  def __init__(self,w,h,d):
    self.vertices = [0]*24
    self.w = w;
    self.h = h;
    self.d = d;

    # cube composed of 6 quads
    #front
    self.vertices[0] =  PVector(-w/2,-h/2,d/2)
    self.vertices[1] =  PVector(w/2,-h/2,d/2)
    self.vertices[2] =  PVector(w/2,h/2,d/2)
    self.vertices[3] =  PVector(-w/2,h/2,d/2)
    #left
    self.vertices[4] =  PVector(-w/2,-h/2,d/2)
    self.vertices[5] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[6] =  PVector(-w/2,h/2,-d/2)
    self.vertices[7] =  PVector(-w/2,h/2,d/2)
    #right
    self.vertices[8] =  PVector(w/2,-h/2,d/2)
    self.vertices[9] =  PVector(w/2,-h/2,-d/2)
    self.vertices[10] =  PVector(w/2,h/2,-d/2)
    self.vertices[11] =  PVector(w/2,h/2,d/2)
    #back
    self.vertices[12] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[13] =  PVector(w/2,-h/2,-d/2)
    self.vertices[14] =  PVector(w/2,h/2,-d/2)
    self.vertices[15] =  PVector(-w/2,h/2,-d/2)
    #top
    self.vertices[16] =  PVector(-w/2,-h/2,d/2)
    self.vertices[17] =  PVector(-w/2,-h/2,-d/2)
    self.vertices[18] =  PVector(w/2,-h/2,-d/2)
    self.vertices[19] =  PVector(w/2,-h/2,d/2)
    #bottom
    self.vertices[20] =  PVector(-w/2,h/2,d/2)
    self.vertices[21] =  PVector(-w/2,h/2,-d/2)
    self.vertices[22] =  PVector(w/2,h/2,-d/2)
    self.vertices[23] =  PVector(w/2,h/2,d/2)
  
  def create(self,quadBG=None):
    # Draw cube
    for i in range(0,6):
      if quadBG:
          fill(quadBG[i])
      beginShape(QUADS)
      for j in range(0,4):
        vertex(self.vertices[j+4*i].x, self.vertices[j+4*i].y, self.vertices[j+4*i].z)
      endShape()

run()


A Visual Exploration of Twitter Streams II

After my first shot at analyzing Twitter data visually I received a lot of great feedback. Thank you!

For my new attempt, I worked on incorporating your feedback, such as filtering unrealistic location changes, showing connections “grow” instead of just popping up, and zooming to an interesting location. The new animation therefore focuses on Manhattan – one of the places with reasonably high geotweet coverage.

The background is based on OpenStreetMap coastline data which I downloaded using the QGIS OSM plugin and rendered in pyprocessing together with the geotweets. To really see what’s going on, switch to HD resolution and full screen:

It’s pretty much a work in progress. The animation shows chaotic patterns similar to those seen in others’ attempts at animating tweets. To me, the distribution of tweets looks reasonable, and many of the connection lines seem to actually coincide with the bridges spanning to and from Manhattan.

This work is an attempt at discovering the potential of Twitter data and at the same time learning some pyprocessing which will certainly be useful for many future tasks. The next logical step seems to be to add information about interactions between users and/or to look at the message content. Another interesting task would be to add interactivity to the visualization.

