Trajectools tutorial: trajectory preprocessing

Today marks the release of Trajectools 2.3 which brings a new set of algorithms, including trajectory generalizing, cleaning, and smoothing.

To give you a quick impression of what some of these algorithms would be useful for, this post introduces a trajectory preprocessing workflow that is quite general-purpose and can be adapted to many different datasets.

We start out with the Geolife sample dataset which you can find in the Trajectools plugin directory’s sample_data subdirectory. This small dataset includes 5908 points forming 5 trajectories, based on the trajectory_id field:

We first split our trajectories by observation gaps to ensure that there are no large gaps in our trajectories. Let’s make a cut at 15 minutes:

This splits the original 5 trajectories into 11 trajectories:

When we zoom in, for example, on the two trajectories in the north-western corner, we can see that the trajectories are pretty noisy and there’s even a spike / outlier at the western end:

If we label the points with the corresponding speeds, we can see how unrealistic they are: over 300 km/h!

Let’s remove outliers over 50 km/h:

Better but not perfect:

Let’s smooth the trajectories to get rid of more of the jittering.

(You’ll need to pip/mamba install the optional stonesoup library to get access to this algorithm.)
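If you prefer scripting over the Processing GUI, the same preprocessing steps can be sketched with MovingPandas directly. This is only a rough sketch: it assumes a TrajectoryCollection tc built from the sample points, and the exact parameter names may differ between versions, so check the documentation:

from datetime import timedelta
import movingpandas as mpd

tc = mpd.ObservationGapSplitter(tc).split(gap=timedelta(minutes=15))  # split at 15 min observation gaps
tc = mpd.OutlierCleaner(tc).clean(v_max=50, units=("km", "h"))        # drop speed spikes above 50 km/h (parameter names assumed)
tc = mpd.KalmanSmootherCV(tc).smooth(process_noise_std=0.1, measurement_noise_std=10)  # requires stonesoup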

Depending on the noise values we choose, we get more or less smoothing:

Let’s zoom out to see the whole trajectory again:

Feel free to pan around and check how our preprocessing affected the other trajectories, for example:

MovingPandas 0.19 released!

This release is the first to support GeoPandas 1.0.

Additionally, this release adds multiple new features, including:

For the full change log, check out the release page.

We have also revamped the documentation at https://movingpandas.readthedocs.io/ using the PyData Sphinx Theme:

On a related note: if you know what I need to change to get all Trajectory functions listed in the TOC on the right, please let me know.

New MovingPandas tutorial: taking OGC Moving Features full circle with MF-JSON

Last week, I had the pleasure of meeting some of the people behind the OGC Moving Features Standards Working Group at the IEEE Mobile Data Management Conference (MDM2024). While chatting about the Moving Features (MF) support in MovingPandas, I realized that, after the “MF-JSON update & tutorial with official sample” post, we never published a complete tutorial on working with MF-JSON encoded data in MovingPandas.

The current MovingPandas development version (to be released as version 0.19) supports:

  • Reading MF-JSON MovingPoint (single trajectory features and trajectory collections)
  • Reading MF-JSON Trajectory
  • Writing MovingPandas Trajectories and TrajectoryCollections to MF-JSON MovingPoint

This means that we can now go full circle: reading — writing — reading.

Reading MF-JSON

Both MF-JSON MovingPoint encoding and Trajectory encoding can be read using the MovingPandas function read_mf_json(). The complete Jupyter notebook for this tutorial is available in the project repo.

Here we read one of the official MF-JSON MovingPoint sample files:

traj = mpd.read_mf_json('data/movingfeatures.json')

Writing MF-JSON

To write MF-JSON, the Trajectory and TrajectoryCollection classes provide a to_mf_json() function:
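For example, a minimal sketch, assuming traj is the Trajectory we read above:

mf_json = traj.to_mf_json()  # returns a Python dict in MF-JSON MovingPoint encoding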

The resulting Python dictionary in MF-JSON MovingPoint encoding can then be saved to a JSON file, and then read again:

import json
with open('mf1.json', 'w') as json_file:
    json.dump(mf_json, json_file, indent=4)
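And to close the circle, the file we just wrote can be read back in (a minimal sketch):

traj_again = mpd.read_mf_json('mf1.json')  # reads the MF-JSON MovingPoint file back into a Trajectory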

Similarly, we can read any arbitrary trajectory data set and save it to MF-JSON.

For example, here we use our usual Geolife sample:

gdf = gpd.read_file('data/demodata_geolife.gpkg')
tc = mpd.TrajectoryCollection(gdf, 'trajectory_id', t='t')
mf_json = tc.to_mf_json(temporal_columns=['sequence'])

And reading it back in again:

import json
with open('mf5.json', 'w') as json_file:
    json.dump(mf_json, json_file, indent=4)
tc = mpd.read_mf_json('mf5.json', traj_id_property='trajectory_id')

Conclusion

The implemented MF-JSON support covers the basic usage of the encodings. There are some fine details in the standard, such as the distinction between time-varying attributes with linear versus step-wise interpolation, which MovingPandas currently does not support.

If you are working with movement data, I would appreciate it if you could give the improved MF-JSON support a spin and report back with your experiences.

New interactive trajectory plots for MovingPandas & experiments on their interpretation using ChatGPT 4o

With the release of GeoPandas 1.0 this month, we’ve finally been able to close a long-standing issue in MovingPandas by adding support for the explore() function, which provides interactive maps using Folium and Leaflet.

The new explore() function will be available in the upcoming MovingPandas 0.19 release if your Python environment includes GeoPandas >= 1.0 and Folium. Of course, if you are curious, you can already test this new functionality using the current development version.

This enables users to access interactive trajectory plots even in environments where it is not possible to install geoviews / hvplot (previously the only option for interactive plots in MovingPandas).
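Here is a minimal sketch of what a call might look like, assuming tc is a TrajectoryCollection; the keyword arguments are passed through to GeoPandas’ explore() and are my assumption here:

tc.add_speed(overwrite=True, units=("km", "h"))  # make sure there is a speed column
tc.explore(column="speed", cmap="plasma", tiles="CartoDB dark_matter")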

I really like the legend for the speed color gradient, but unfortunately, the legend labels are not readable on the dark background map since they lack the semi-transparent white background that has been applied to the scale bar and credits label.

Speaking of reading / interpreting the plots …

You’ve probably seen the claims that AI will help make tools more accessible. Clearly AI can interpret and describe photos, but can it also interpret MovingPandas plots?

ChatGPT 4o interpretations of MovingPandas plots

Not bad.

And what happens if we ask it to interpret the animated GIF from the beginning of the blog post?

So it looks like ChatGPT extracts 12 frames and analyzes them to answer our question:

Its guesses are not completely off, but it made up facts, such as claiming that the view shows “how traffic speeds vary over time”.

The problem remains that models such as ChatGPT tend to make up interpretations rather than concede that they do not have enough information to make a reliable statement.

New Trajectools 2.1 and MovingPandas 0.18 releases

Today marks the 2.1 release of Trajectools for QGIS. This release adds multiple new algorithms and improvements. Since some improvements involve upstream MovingPandas functionality, I recommend also updating MovingPandas while you’re at it.

If you have installed QGIS and MovingPandas via conda / mamba, you can simply:

conda activate qgis
mamba install movingpandas=0.18

Afterwards, you can check that the library was correctly installed using:

import movingpandas as mpd
mpd.show_versions()

Trajectools 2.1

The new Trajectools algorithms are:

  • Trajectory overlay — Intersect trajectories with polygon layer
  • Privacy — Home work attack (requires scikit-mobility)
    • This algorithm determines how easy it is to identify an individual in a dataset. In a home and work attack the adversary knows the coordinates of the two locations most frequently visited by an individual.
  • GTFS — Extract segments (requires gtfs_functions)
  • GTFS — Extract shapes (requires gtfs_functions)

Furthermore, we have fixed an issue where minimum trajectory length settings were previously ignored.

Scikit-mobility and gtfs_functions are optional dependencies. You do not need to install them if you do not want to use the corresponding algorithms. In any case, they can be installed using mamba and pip:

mamba install scikit-mobility
pip install gtfs_functions

MovingPandas 0.18

This release adds multiple new features, including:

  • Method chaining support for add_speed(), add_direction(), and other functions (see the sketch after this list)
  • New TrajectoryCollection.get_trajectories(obj_id) function
  • New trajectory splitter based on heading angle
  • New TrajectoryCollection.intersection(feature) function
  • New plotting function hvplot_pts()
  • Faster TrajectoryCollection operations through multi-threading
  • Added moving object weights support to trajectory aggregator
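Here’s a rough sketch of what the new method chaining support enables, assuming tc is an existing TrajectoryCollection (the unit tuple follows the unit support added in 0.16; treat the exact spelling as an assumption):

tc = (
    tc.add_speed(units=("km", "h"))  # speed in km/h
      .add_direction()               # heading in degrees
      .add_timedelta()               # time gap between consecutive points
)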

For the full change log, check out the release page.

Trajectools 2.0 released 🎉

It’s my pleasure to share with you that Trajectools 2.0 just landed in the official QGIS Plugin Repository.

This is the first version without the “experimental” flag. If you look at the plugin release history, you will see that the previous release was from 2020. That’s quite a while ago and a lot has happened since, including the development of MovingPandas.

Let’s have a look what’s new!

The old “Trajectories from point layer”, “Add heading to points”, and “Add speed (m/s) to points” algorithms have been superseded by the new “Create trajectories” algorithm which automatically computes speeds and headings when creating the trajectory outputs.

“Day trajectories from point layer” is covered by the new “Split trajectories at time intervals” which supports splitting by hour, day, month, and year.

“Clip trajectories by extent” still exists but, additionally, we can now also “Clip trajectories by polygon layer”.

There are two new event extraction algorithms, “Extract OD points” and “Extract stop points”, as well as the related “Split trajectories at stops”. Additionally, we can also “Split trajectories at observation gaps”.

Trajectory outputs, by default, come as a pair of a point layer and a line layer. Depending on your use case, you can use both or pick just one of them. By default, the line layer is styled with a gradient line that makes it easy to see the movement direction:

while the default point layer style shows the movement speed:

How to use Trajectools

Trajectools 2.0 is powered by MovingPandas. You will need to install MovingPandas in your QGIS Python environment. I recommend installing both QGIS and MovingPandas from conda-forge:

(base) conda create -n qgis -c conda-forge python=3.9 
(base) conda activate qgis
(qgis) mamba install -c conda-forge qgis movingpandas

The plugin download includes small trajectory sample datasets so you can get started immediately.

Outlook

There is still some work to do to reach feature parity with MovingPandas. Stay tuned for more trajectory algorithms, including but not limited to down-sampling, smoothing, and outlier cleaning.

I’m also reviewing other existing QGIS plugins to see how they can complement each other. If you know a plugin I should look into, please leave a note in the comments.

Trajectools update: stop detection & trajectory styling

The Trajectools toolbox has continued growing:

I’m continuously testing the algorithms integrated so far to see if they work as GIS users would expect and to ensure that they can be integrated into Processing models seamlessly.

Because naming things is tricky, I’m currently struggling with how to best group the toolbox algorithms into meaningful categories. I looked into the categories mentioned in OGC Moving Features Access but honestly found them kind of lacking:

Andrienko et al.’s book “Visual Analytics of Movement” comes closer to what I’m looking for:

… but I’m not convinced yet. So take the three categories listed above with a grain of salt. Those may change before the release. (Any inputs / feedback / recommendations welcome!)

Let me close this quick status update with a screencast showcasing stop detection in AIS data, featuring the recently added trajectory styling using interpolated lines:

While Trajectools is getting ready for its 2.0 release, you can get the current development version directly from https://github.com/movingpandas/qgis-processing-trajectory.

QGIS Processing Trajectools v2 in the works

Trajectools development started back in 2018 but has been on hold since 2020 when I realized that it would be necessary to first develop a solid trajectory analysis library. With the MovingPandas library in place, I’ve now started to reboot Trajectools.

Trajectools v2 builds on MovingPandas and exposes its trajectory analysis algorithms in the QGIS Processing Toolbox. So far, I have integrated the basic steps of

  1. Building trajectories including speed and direction information from timestamped points and
  2. Splitting trajectories at observation gaps, stops, or regular time intervals.

The algorithms create two output layers:

  • Trajectory points with speed and direction information that are styled using arrow markers
  • Trajectories as LineStringMs which makes it straightforward to count the number of trajectories and to visualize where one trajectory ends and another starts.

So far, the default style for the trajectory points is hard-coded to apply the Turbo color ramp on the speed column with values from 0 to 50 (since I’m simply loading a ready-made QML). By default, the speed is calculated as km/h but that can be customized:

I don’t have a solution yet to automatically create a style for the trajectory lines layer. Ideally, the style should be a categorized renderer that assigns random colors based on the trajectory id column. But in this case, it’s not enough to just load a QML.

In the meantime, I might instead include an Interpolated Line style. What do you think?

Of course, the goal is to make Trajectools interoperable with as many existing QGIS Processing Toolbox algorithms as possible to enable efficient Mobility Data Science workflows.

The easiest way to set up QGIS with a MovingPandas Python environment is to install both from conda. You can find the instructions together with the latest Trajectools development version at: https://github.com/movingpandas/qgis-processing-trajectory


This post is part of a series. Read more about movement data in GIS.

Analyzing mobility hotspots with MovingPandas & CARTO

Today, I want to point out a blog post over at

https://carto.com/blog/analyzing-mobility-hotspots-with-movingpandas

written together with my fellow co-authors and EMERALDS project team members Argyrios Kyrgiazos and Helen McKenzie.

In this blog post, we walk you through a trajectory hotspot analysis using open taxi trajectory data from Kaggle, combining data preparation with MovingPandas (including the new OutlierCleaner illustrated above) and spatiotemporal hotspot analysis from Carto.

MovingPandas v0.17 released!

Over the last couple of months, I have not been posting release announcements here, so there is quite a bit to catch up on.

The latest v0.17.2 release is now available from conda-forge.

New features (since 0.14):

  • Improved MovingFeatures MF-JSON support
    • Ability to parse a MovingFeatureCollection from a json file #330
    • GeoDataFrame to MF-JSON #325
    • Adding read_mf_dict function #357
  • New OutlierCleaner #334
  • Faster stop detection #316
  • New arrow markers to indicate trajectory direction in plots fb1174b 
  • Distance, speed, and acceleration unit handling #295
  • New aggregation parameter (agg) for to_traj_gdf() 5745068 
  • New get_segments_between() for TrajectoryCollection #287 

Behind the scenes:

  • We now have a dedicated Github organization: https://github.com/movingpandas that houses all related repositories
  • And we finally added https support to the website

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.

Analyzing video-based bicycle trajectories

Did you know that MovingPandas also supports local image coordinates? Indeed, it does.

In today’s post, we will explore how we can use this feature to analyze bicycle tracks extracted from video footage published by Michael Szell @mszll:

The bicycle trajectory coordinates are stored in two separate lists: xs_640x360 and ys_640x360:

This format is kind of similar to the Kaggle Taxi dataset we worked with in the previous post. However, to use the solution we implemented there, we need to combine the x and y coordinates into nice (x,y) tuples:

df['coordinates'] = df.apply(
    lambda row: list(zip(row['xs_640x360'], row['ys_640x360'])), axis=1)
df.drop(columns=['xs_640x360', 'ys_640x360'], inplace=True)

Afterwards, we can create the points and compute the proper timestamps from the frame numbers:

from datetime import datetime, timedelta
from shapely.geometry import Point

def compute_datetime(row):
    # some educated guessing going on here: the paper states that the video covers 2021-06-09 07:00-08:00
    d = datetime(2021,6,9,7,0,0) + (row['frame_in'] + row['running_number']) * timedelta(seconds=2)
    return d

def create_point(xy):
    try:
        return Point(xy)
    except TypeError:  # when there are nan values in the input data
        return None

new_df = df.head().explode('coordinates')
new_df['geometry'] = new_df['coordinates'].apply(create_point)
new_df['running_number'] = new_df.groupby('id').cumcount()
new_df['datetime'] = new_df.apply(compute_datetime, axis=1)
new_df.drop(columns=['coordinates', 'frame_in', 'running_number'], inplace=True)
new_df

Once the points and timestamps are ready, we can create the MovingPandas TrajectoryCollection. Note how we explicitly state that there is no CRS for this dataset (crs=None):

trajs = mpd.TrajectoryCollection(
    gpd.GeoDataFrame(new_df), 
    traj_id_col='id',  t='datetime', crs=None)

Plotting trajectories with image coordinates

Similarly, to plot these trajectories, we should tell hvplot that it should not fetch any background map tiles ('tiles': None) and that the coordinates are not geographic ('geo': False):
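For example, a rough sketch of such a plotting call, assuming the trajs collection created above:

trajs.hvplot(tiles=None, geo=False, line_width=3, frame_width=640, frame_height=360)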

If you want to explore the full source code, you can find my Github fork with the Jupyter notebook at: https://github.com/anitagraser/desirelines/blob/main/mpd.ipynb

The repository also contains a camera image of the intersection, which we can use as a background for our trajectory plots:

import holoviews as hv
bg_img = hv.RGB.load_image('img/intersection2.png', bounds=(0,0,640,360))

One important caveat is that speed will be calculated in pixels per second. So when we plot the bicycle speed, the segments closer to the camera will appear faster than the segments in the background:

To fix this issue, we would have to correct for the distortions of the camera lens and perspective. I’m sure that there is specialized software for this task but, for the purpose of this post, I’m going to grab the opportunity to finally test out the VectorBender plugin.

Georeferencing the trajectories using QGIS VectorBender plugin

Let’s load the five test trajectories and the camera image to QGIS. To make sure that they align properly, both are set to the same CRS and I’ve created the following basic world file for the camera image:

1
0
0
-1
0
360

Then we can use the VectorBender tools to georeference the trajectories by linking locations from the camera image to locations on aerial images. You can see the whole process in action here:

After around 15 minutes linking control points, VectorBender comes up with the following georeferenced trajectory result:

Not bad for a quick-and-dirty hack. Some points on the borders of the image could not be georeferenced since I wasn’t always able to identify suitable control points at the camera image borders. So it won’t be perfect but should improve speed estimates.


This post is part of a series. Read more about movement data in GIS.

How to use Kaggle’s Taxi Trajectory Data in MovingPandas

Kaggle’s “Taxi Trajectory Data from ECML/PKDD 15: Taxi Trip Time Prediction (II) Competition” is one of the most used mobility / vehicle trajectory datasets in computer science. However, in contrast to other similar datasets, Kaggle’s taxi trajectories are provided in a format that is not readily usable in MovingPandas since the spatiotemporal information is provided as:

  • TIMESTAMP: (integer) Unix Timestamp (in seconds). It identifies the trip’s start;
  • POLYLINE: (String): It contains a list of GPS coordinates (i.e. WGS84 format) mapped as a string. The beginning and the end of the string are identified with brackets (i.e. [ and ], respectively). Each pair of coordinates is also identified by the same brackets as [LONGITUDE, LATITUDE]. This list contains one pair of coordinates for each 15 seconds of trip. The last list item corresponds to the trip’s destination while the first one represents its start;

Therefore, we need to create a DataFrame with one point + timestamp per row before we can use MovingPandas to create Trajectories and analyze them.

But first things first. Let’s download the dataset:

import datetime
import pandas as pd
import geopandas as gpd
import movingpandas as mpd
import opendatasets as od
from os.path import exists
from shapely.geometry import Point

input_file_path = 'taxi-trajectory/train.csv'

def get_porto_taxi_from_kaggle():
    if not exists(input_file_path):
        od.download("https://www.kaggle.com/datasets/crailtap/taxi-trajectory")

get_porto_taxi_from_kaggle()
df = pd.read_csv(input_file_path, nrows=10, usecols=['TRIP_ID', 'TAXI_ID', 'TIMESTAMP', 'MISSING_DATA', 'POLYLINE'])
df.POLYLINE = df.POLYLINE.apply(eval)  # string to list
df

And now for the remodelling:

def unixtime_to_datetime(unix_time):
    return datetime.datetime.fromtimestamp(unix_time)

def compute_datetime(row):
    unix_time = row['TIMESTAMP']
    offset = row['running_number'] * datetime.timedelta(seconds=15)
    return unixtime_to_datetime(unix_time) + offset

def create_point(xy):
    try: 
        return Point(xy)
    except TypeError:  # when there are nan values in the input data
        return None
 
new_df = df.explode('POLYLINE')
new_df['geometry'] = new_df['POLYLINE'].apply(create_point)
new_df['running_number'] = new_df.groupby('TRIP_ID').cumcount()
new_df['datetime'] = new_df.apply(compute_datetime, axis=1)
new_df.drop(columns=['POLYLINE', 'TIMESTAMP', 'running_number'], inplace=True)
new_df

And that’s it. Now we can create the trajectories:

trajs = mpd.TrajectoryCollection(
    gpd.GeoDataFrame(new_df, crs=4326), 
    traj_id_col='TRIP_ID', obj_id_col='TAXI_ID', t='datetime')
trajs.hvplot(title='Kaggle Taxi Trajectory Data', tiles='CartoLight')

That’s it. Now our MovingPandas.TrajectoryCollection is ready for further analysis.

By the way, the plot above illustrates a new feature in the recent MovingPandas 0.16 release which, among other features, introduced plots with arrow markers that show the movement direction. Other new features include completely new support for custom distance, speed, and acceleration units. This means that, for example, instead of always getting speed in meters per second, you can now specify your desired output units, including km/h, mph, or nm/h (knots).
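For instance, here’s a small sketch of the unit support; treat the exact unit abbreviations and keyword names as assumptions and check the documentation for the supported values:

traj.add_speed(overwrite=True, units=("km", "h"))      # km/h instead of the default m/s
traj.add_speed(name="speed_knots", units=("nm", "h"))  # knots, written to a separate column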


This post is part of a series. Read more about movement data in GIS.

MovingPandas v0.13 & v0.14 released!

December has been busy with two new MovingPandas releases: v0.13 and v0.14.

The latest v0.14 release is now available from conda-forge.

These releases are a huge step forward towards making MovingPandas easier to install with fewer mandatory dependencies. All interactive plotting libraries are now optional. So if you are using MovingPandas for trajectory data processing in the background and don’t need the interactive visualization features, the number of necessary libraries is now much lower. This (and the fact that GeoPandas is now shipped with OSGeo4W) will also make it easier to use MovingPandas in QGIS plugins.

New features:

  • #268 New add_angular_difference method

Includes fixes and enhancements for:

  • #267 Improved documentation: direction values are [0, 360)

Behind the scenes:

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.

MovingPandas v0.12 released!

The latest v0.12 release is now available from conda-forge.

This release contains some really cool new features, including:

  • New function to add an acceleration column #253
  • We have further improved our repo setup by adding an action that automatically creates and publishes packages from releases, heavily inspired by the work of the GeoPandas team.
  • Last but not least, we’ve created a Twitter account for the project. (And might soon add a Mastodon account as well.)

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.

MovingPandas v0.11 released!

The latest v0.11 release is now available from conda-forge.

This release contains some really cool new algorithms:

  • New minimum and Hausdorff distance measures #37
  • New functions to add a timedelta column and get the trajectory sampling interval #233 

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

The new distance measures are covered in tutorial #11:

Computing distances between trajectories, as illustrated in tutorial #11

Computing distances between a trajectory and other geometry objects, as illustrated in tutorial #11

But don’t miss the great features covered by the other notebooks, such as outlier cleaning and smoothing:

Trajectory cleaning and smoothing, as illustrated in tutorial #10

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our discussion forum.

MovingPandas v0.10 released!

The latest v0.10 release is now available from conda-forge.

This release contains some really cool new algorithms:

If you have questions about using MovingPandas or just want to discuss new ideas, you’re welcome to join our recently opened discussion forum.

As always, all tutorials are available from the movingpandas-examples repository and on MyBinder:

Besides other examples, the movingpandas-examples repo contains the following tech demo: an interactive app built with Panel that demonstrates different MovingPandas stop detection parameters.

To start the app, open the stopdetection-app.ipynb notebook and press the green Panel button in the Jupyter Lab toolbar:

MF-JSON update & tutorial with official sample

Since last week’s post, I’ve learned that there is an official OGC Moving Features JSON Encodings repository with more recent sample datasets, including MovingPoint, MovingPolygon, and Trajectory JSON examples.

The MovingPoint example seems to describe a storm, including its path (temporalGeometry), pressure, wind strength, and class values (temporalProperties):

You can give the current implementation a spin using this MyBinder notebook.

An exciting future step would be to experiment with extending MovingPandas to support the MovingPolygon MF-JSON examples. MovingPolygons can change their size and orientation as they move. I’m not yet sure, however, if the number of polygon nodes can change between time steps and how this would be reflected by the prism concept presented in the draft specification:

Image source: https://ksookim.github.io/mf-json/

New OGC Moving Features JSON support in MovingPandas

We first talked about the OGC Moving Features standard in a post from 2017. Back then, we looked at the proposed standard way to encode trajectories in CSV and discussed its issues. Since then, the Moving Features working group at OGC has not been idle. Besides the CSV and XML encodings, they have designed a new JSON encoding that addresses many of the downsides of the previous two. You can read more about this in our 2020 preprint “From Simple Features to Moving Features and Beyond”.

Basically, Moving Features JSON (MF-JSON) is heavily inspired by GeoJSON, and it comes with a bunch of mandatory and optional key/value pairs. There is support for static properties as well as dynamic temporal properties and, of course, temporal geometries (yes, geometries, not just points).
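To give a rough idea of the structure, here is a simplified, hypothetical MovingPoint feature written as a Python dict; the key names follow the draft specification, but please refer to the spec for the exact schema:

mf_json = {
    "type": "Feature",
    "properties": {"name": "storm"},                       # static properties
    "temporalGeometry": {
        "type": "MovingPoint",
        "datetimes": ["2011-07-14T22:01:01Z", "2011-07-14T22:01:02Z"],
        "coordinates": [[139.757, 35.627], [139.758, 35.628]],
        "interpolation": "Linear",
    },
    "temporalProperties": [                                 # dynamic, time-varying properties
        {
            "datetimes": ["2011-07-14T22:01:01Z", "2011-07-14T22:01:02Z"],
            "pressure": {"values": [1004.0, 1003.5], "interpolation": "Linear"},
        }
    ],
}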

I think this format may have an actual chance of gaining more widespread adoption.

Image source: http://www.opengis.net/doc/BP/mf-json/1.0

Inspired by Pandas.read_csv() and GeoPandas.read_file(), I’ve started implementing a read_mf_json() function in MovingPandas. So far, it supports basic MovingFeature JSONs with MovingPoint geometry:

You’ll need to use the current development version to test this feature.

Next steps will be MovingFeatureCollection JSONs and support for static as well as temporal properties. We’ll have to see if MovingPandas can be extended to go beyond moving point geometries. Storing moving linestrings and polygons in the GeoDataFrame will be the simple part but analytics and visualization will certainly be more tricky.


This post is part of a series. Read more about movement data in GIS.

Geospatial: where MovingPandas meets Leafmap

Many of you certainly have already heard of and/or even used Leafmap by Qiusheng Wu.

Leafmap is a Python package for interactive spatial analysis with minimal coding in Jupyter environments. It provides interactive maps based on folium and ipyleaflet, spatial analysis functions using WhiteboxTools and whiteboxgui, and additional GUI elements based on ipywidgets.

This way, Leafmap achieves a look and feel that is reminiscent of a desktop GIS:

Image source: https://github.com/giswqs/leafmap

Recently, Qiusheng has started an additional project: the geospatial meta package which brings together a variety of different Python packages for geospatial analysis. As such, the main goals of geospatial are to make it easier to discover and use the diverse packages that make up the spatial Python ecosystem.

Besides the usual suspects, such as GeoPandas and of course Leafmap, one of the packages included in geospatial is MovingPandas. Thanks, Qiusheng!

I’ve tested the mamba install today and am very happy with how this worked out. There is just one small hiccup currently, which is related to an upstream jinja2 issue. After installing geospatial, I therefore downgraded jinja:

mamba install -c conda-forge geospatial 
mamba install -c conda-forge jinja2=3.0

Of course, I had to try Leafmap and MovingPandas in action together. Therefore, I fired up one of the MovingPandas example notebooks (here, the example on clipping trajectories using polygons). As you can see, the integration is pretty smooth since Leafmap already supports drawing GeoPandas GeoDataFrames and MovingPandas can convert trajectories to GeoDataFrames (both lines and points):

Clipped trajectory segments as linestrings in Leafmap
Leafmap includes an attribute table view that can be activated on user request to show, e.g. trajectory information
And, of course, we can also map the original trajectory points
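A minimal sketch of this kind of integration could look like this, assuming tc is the TrajectoryCollection from the clipping example:

import leafmap

m = leafmap.Map()
m.add_gdf(tc.to_traj_gdf(), layer_name="trajectory lines")    # trajectories as linestrings
m.add_gdf(tc.to_point_gdf(), layer_name="trajectory points")  # the original points
m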

Geospatial also includes the new dask-geopandas library which I’m very much looking forward to trying out next.

MovingPandas now supports local coordinates

MovingPandas 0.9rc3 has just been released, including important fixes for local coordinate support. Sports analytics is just one example of movement data analysis that uses local rather than geographic coordinates.

Many movement data sources – such as soccer players’ movements extracted from video footage – use local reference systems. This means that x and y represent positions within an arbitrary frame, such as a soccer field.

Since GeoPandas and GeoViews support handling and plotting local coordinates just fine, there is nothing stopping us from applying all MovingPandas functionality to this data. For example, to visualize the movement speed of players:
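A rough sketch of how this could look, assuming a hypothetical DataFrame df with player positions in columns x, y, t, and player_id:

import geopandas as gpd
import movingpandas as mpd

gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df.x, df.y), crs=None)  # local, non-geographic coordinates
tc = mpd.TrajectoryCollection(gdf, traj_id_col="player_id", t="t", crs=None)
tc.add_speed()
tc.hvplot(c="speed", line_width=5, geo=False, tiles=None)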

Of course, we can also plot other trajectory attributes, such as the team affiliation.

But one particularly useful feature is the ability to use custom background images, for example, to show the soccer field layout:

To access the full example notebook, visit: https://github.com/anitagraser/movingpandas/blob/master/tutorials/5-local-coordinates.ipynb

An update to the MovingPandas examples repository will follow shortly.
