• Tag Archives: QGIS
  • Even More Sals GIS Fun

    My map looked so good in the QField app that I thought it might be nice to build a web map, one that could be generally available rather than part of a very specialized app. And rather than doing it from scratch, I decided to try some of the QGIS plugins.

    My first try was with the QGIS Cloud plugin. I’d used this before (verdict: meh), but I still had an account, so I decided to give it another try. The verdict is still “meh,” but I did get a web map out of it; check it out here. It looks as good as my QField map, which isn’t surprising since it’s basically my original project running on QGIS Cloud’s servers, but this setup comes with a lot of latency: the map takes a while to redisplay after every move or resize. It also had some trouble showing my location when I first launched it on my phone (it worked fine on the laptop), but the problem eventually resolved itself — it might have been a permissions issue, and I might have solved it by pressing random buttons…

    The other plugin I tried is called qgis2web, which builds a local web map using the standard Leaflet or OpenLayers JavaScript libraries. This sounded like a great approach, but as soon as I ran the plugin, it crashed QGIS — doh!

    It turns out that qgis2web can only work with very simple feature styling: lines (for instance) can be dotted or solid but not a mix, and can only be one color, while my trails were dotted lines in one color, drawn on wider solid lines in another color…

    So, I created yet another Sals sub-project, with a subset of my map’s features (just trails and roads, streams, and trailheads) and a much-simplified symbology. Just for fun I tried building an OpenLayers map, since I’d never used OpenLayers before. It came out great, and though it doesn’t look as fancy as the QGIS Cloud map, it loads and runs much faster. I put this one online as well; you can find it here.

    Meanwhile, Ben and Jenny arrived yesterday for their Christmas visit, and today we went for a hike at Sals. I used my QField app to record a bunch of marker posts — I didn’t want to turn the hike into a “Don plays with his maps” debacle, so I didn’t break out either of my new web maps. But capturing the data in QField was a snap, and incorporating it into my main project was mostly seamless; I’d guess I now have about half of the total number of markers added.

    And that means that my two new web maps are already out of date…


  • More Fun With QGIS

    Anne and I did a hike a few days ago at Sals, very pleasant and a good workout — we got in about 6 miles — but while there I was reminded of a project I’d been meaning to get started on: about a year ago the VMB put in numbered marker posts at trail junctions, and I had been planning for a while to document them for my Sals map.

    (These numbered posts were something I’d advocated for maybe a decade ago, when I was involved with the trails up there. My advocacy didn’t go anywhere at the time, but it is good to see that the plan eventually got implemented.)

    Logging the post locations could be done with my GPS, and I’d done a few that way over the past year, but I now see this as another opportunity to play with QField. And that’s my new project.

    The first thing I did was clean up my Sals map project in QGIS: I had a big mishmash of data layers in different formats (mostly GeoJSON and shapefiles) and multiple coordinate reference systems, so I converted them all to the same reference system (EPSG:4326, the one used by GPS devices) and then put them all in a GeoPackage, a sort of portable database file. This really cleaned up my project.
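    (If you’d rather script that kind of cleanup than click through the GUI, something like this, run from the QGIS Python console, would do the job; the output path here is just a placeholder.)

        # A rough sketch of the cleanup, using QGIS Processing algorithms;
        # the output path is a placeholder.
        import processing
        from qgis.core import QgsProject, QgsVectorLayer

        reprojected = []
        for layer in QgsProject.instance().mapLayers().values():
            if not isinstance(layer, QgsVectorLayer):
                continue  # skip rasters and other non-vector layers
            result = processing.run("native:reprojectlayer", {
                "INPUT": layer,
                "TARGET_CRS": "EPSG:4326",  # WGS 84, the CRS used by GPS devices
                "OUTPUT": "memory:" + layer.name(),
            })
            reprojected.append(result["OUTPUT"])

        # Bundle all the reprojected layers into a single GeoPackage
        processing.run("native:package", {
            "LAYERS": reprojected,
            "OUTPUT": "/path/to/sals.gpkg",  # hypothetical output path
            "OVERWRITE": True,
        })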

    I then created a new project (one that also uses my new Sals GeoPackage), along with a second GeoPackage that will be editable in the QField app (as opposed to my official data GeoPackage, which will be locked down), with a layer to record the new marker posts. This basically means I will capture the data in a scratch file before moving it to the official package. This approach wouldn’t work well for updating existing data, but I think it’ll do well enough here, and it doesn’t leave my official data as vulnerable to field mistakes.
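    (Setting up a scratch layer like that can also be done in a few lines of PyQGIS; here’s a minimal sketch, where the file path and field names are just for illustration, not my real schema.)

        # A minimal sketch of creating a scratch GeoPackage with a single
        # point layer for marker posts; path and field names are illustrative.
        from qgis.core import (QgsCoordinateReferenceSystem,
                               QgsCoordinateTransformContext, QgsField,
                               QgsFields, QgsVectorFileWriter, QgsWkbTypes)
        from qgis.PyQt.QtCore import QVariant

        fields = QgsFields()
        fields.append(QgsField("post_number", QVariant.String))
        fields.append(QgsField("notes", QVariant.String))

        options = QgsVectorFileWriter.SaveVectorOptions()
        options.driverName = "GPKG"
        options.layerName = "marker_posts"

        QgsVectorFileWriter.create(
            "/path/to/field_edits.gpkg",  # hypothetical scratch package
            fields,
            QgsWkbTypes.Point,
            QgsCoordinateReferenceSystem("EPSG:4326"),
            QgsCoordinateTransformContext(),
            options,
        )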

    So I did all that, and then went through the hoops to get my new QField project onto my phone. Opened it up — it looks beautiful; it actually looks better on the phone than on the laptop.


  • Fun With Rasters

    I’ve been experimenting with raster data lately, photographing trail maps from Indian Paths of Pennsylvania and then digitizing them for use as map overlays in my project (I rough in the paths by tracing over them on the maps). This has worked really well, at least when there is a path on the map — alternate paths are sometimes missing — but it came with a few problems:

    • The digitized maps (georeferenced to match my map and converted to GeoTIFF format) look great, but the first one I did weighed in at a whopping 27 MB. Since I expect to generate at least a hundred of these, that’s a significant amount of disk space.
    • The maps start out as color photographs, and once they are in map form there is a lot of extraneous stuff that overlays (and blocks) the basemap beneath them.

    So, I came up with a workflow that brings in my map images while avoiding these problems:

    1. I start by taking a photo (with my phone) of the map in question, trying to get “nothing but map” in the shot.
    2. Using GIMP, I rotate and crop as necessary, then clean up the photo by making off-white sections white, despeckling, and increasing brightness/contrast. I then invert the colors, making it a B&W negative before saving.
    3. In QGIS I georeference the modified photo, using river confluences and other geographic features as my reference points. (I try for six or more “ground control” points to reference, and use the 2nd-order polynomial transformation to account for bent pages in the photo, though if the resulting transformation doesn’t look good I’ll try other options.)
    4. Finally, I convert the resulting TIFF from RGB (full color) to PCT format (a paletted, indexed-color format), and save it at half the original resolution.

    I can load the resulting raster as an overlay, and its pixels should each be one of only two values (zero and one). I make the zero values transparent and the one values black, and now I have a very usable map overlay. The final GeoTIFF files average about 100 KB each.
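    (For the curious, that last conversion step can be scripted with the GDAL Python bindings; it’s essentially what GDAL’s bundled rgb2pct.py utility does. A rough sketch, with placeholder file names:)

        # A rough sketch of step 4 using the GDAL Python bindings; this is
        # roughly what GDAL's rgb2pct.py utility does. File names are
        # placeholders.
        from osgeo import gdal

        # Halve the resolution first; gdal.Translate keeps the georeferencing
        half = gdal.Translate("overlay_half.tif", "overlay_georef.tif",
                              widthPct=50, heightPct=50)

        # Build a two-entry color table from the B&W image and dither into it
        r, g, b = (half.GetRasterBand(i) for i in (1, 2, 3))
        ct = gdal.ColorTable()
        gdal.ComputeMedianCutPCT(r, g, b, 2, ct)

        dst = gdal.GetDriverByName("GTiff").Create(
            "overlay_pct.tif", half.RasterXSize, half.RasterYSize, 1,
            gdal.GDT_Byte)
        dst.SetProjection(half.GetProjection())
        dst.SetGeoTransform(half.GetGeoTransform())
        band = dst.GetRasterBand(1)
        band.SetRasterColorTable(ct)
        gdal.DitherRGB2PCT(r, g, b, band, ct)
        dst.FlushCache()

    (The transparent-zero/black-one rendering itself is set in the QGIS layer styling, not in the file.)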

    This makes tracing the paths very easy, maybe too easy: I feel a temptation to take the paths as gospel, even though I have no real idea of either the original map accuracy or the accuracy of the georeferenced overlay. Then again, it is the information as given in the book, and that’s what I set out to capture. Anyway, it’s a good first step. I’ve done about a dozen so far.


  • Next Steps for The Native Paths

    So I have about a dozen native paths left to add to my database in this first pass through Indian Paths of Pennsylvania (that book I’m following/analyzing/whatever). This is the pass where I go through the book from beginning to end, adding the basic info for each path to the database, adding the info about the start points and destinations, and generating the routes described in each chapter’s “For the Motorist” section.

    There are a few big pieces left to this project, which I think I can do all together in a second pass through the book:

    1. I need to document the relationships between the paths, as described for each path in the book (this is why I had to go through the book once already: to get all the paths documented before trying to map the relationships).
    2. I need to generate the actual footpath routes. This will probably be the most difficult and labor-intensive task in the whole project; I expect it will involve digitizing all the (low-quality) maps in the book, and it may also require trying to find primary sources (old deeds, land grants, etc.), and even after all that I expect I’ll have to live with a great deal of ambiguity in the routes.
    3. I’m not sure if I want to do this yet, but as I go through the book I may document any points of interest (landmarks, native towns that aren’t trail endpoints) that I haven’t already included.

    I’ve been thinking about the first part for a while, and have set up a separate bridge table in the database to capture these relationships. The table has links to a subject path (the one doing the referring) and an object path (the one being referenced), plus a link to another table listing the possible relationships between them: intersections, alternate routes and spurs, aliases and alternate names, concurrencies (i.e., where the path shares some section of trail with another path), and the ever-popular “for more information see also.” I can add more relationships as I see the need.
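    (To make that a little more concrete, here’s roughly what such a pair of tables could look like, sketched in SQLite syntax; the table and column names here are just for illustration.)

        # A sketch of the bridge table plus its lookup table; names are
        # illustrative, and a pre-existing "paths" table is assumed.
        import sqlite3

        conn = sqlite3.connect("native_paths.db")  # hypothetical database
        conn.executescript("""
        CREATE TABLE IF NOT EXISTS relationship_types (
            id    INTEGER PRIMARY KEY,
            label TEXT NOT NULL  -- 'intersects', 'alternate name for', etc.
        );
        CREATE TABLE IF NOT EXISTS path_relationships (
            id           INTEGER PRIMARY KEY,
            subject_path INTEGER NOT NULL REFERENCES paths (id), -- the referrer
            object_path  INTEGER NOT NULL REFERENCES paths (id), -- the referenced
            rel_type     INTEGER NOT NULL REFERENCES relationship_types (id)
        );
        """)
        conn.commit()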

    For the second task, I think I’ll want to use the maps in the book, even if it’s just to trace over them. That means scanning the maps in some way without damaging the book (I may just photograph them with my phone), then georeferencing them and saving the result somewhere. I suspect I’ll end up with a pretty big set of raster data, and I now need to consider how to organize it. Rasters are not something I have much experience with, so there will likely be a learning curve involved — I think I may put them in the database in some way.

    In terms of original research, my plan at the start was to use Indian Paths of Pennsylvania as my sole source — my project would be the book translated into GIS form — but I’ve already used other sources (e.g., Wikipedia, town websites) to flesh out histories and descriptions, and I’m now seeing the book as a condensation of other info, even if it’s just the author’s research files; the info in the book may have been “condensed” (oversimplified, to the point of vagueness), but more exact versions of the trail descriptions exist somewhere. I really don’t want to get into actual archival research for this, though, and it may just turn out that if I dig really deep, I’ll only find that all the primary information is pretty vague too…

    Finally, that third task has me a bit stuck: I’d originally planned to only record the endpoints of the paths (as given in the book), and even called my points table “termini.” Now I’m looking to enter things like landmarks, known trail junctions — there are several places called “the parting of the ways” — towns that aren’t actually endpoints, and all sorts of other points of interest. I painted myself into a corner with that “termini” name, and even if it wouldn’t be too much of a stretch to stick these other points in the termini table, I may add a separate “points of interest” table, or at least add a column to the termini table to designate non-endpoints. (Maybe I’m overthinking this, I could easily find the endpoints and non-endpoints within a mixed table, just by using simple searches.)

    This kind of gets to what I want my native paths data to eventually look like. Many of these landmarks and points of interest are likely to be nodes in a trail network (just like the endpoints), so I am back to thinking they should all be part of the same table. I also expect that I’ll have trail segments from node to node, and my final paths will be lists of trail segments from start point to end point, so my final product will not look much like what I’m building now.

    Well, I still have some time to think about it.


  • Indian Paths Update

    I’m still cruising along on this project: I’ve got just over 110 paths in the database (of maybe 150 total), about 130 towns or other path endpoints, and 92 motor routes. I haven’t added any actual footpaths yet, but the motor routes are starting to look like a real network.

    My current plan is to parse the book three times: once (this time around) to capture the paths, path endpoints, and motor routes; once (the final, and probably most difficult, round) to try and develop the original foot paths; and in between these rounds I will go through the paths/chapters and try to capture all the cross-references between them.

    I noticed early on that there were a lot of things like “this is an extension of that other path,” “so-and-so path also goes by this name,” “this path intersects with these others,” and the like throughout the text; the path descriptions are festooned with these kinds of cross-references.

    (I also finally picked up on the fact that paths without a path/chapter number are not actually part of the previous chapter, but are basically “chapterless,” just the next path name in alphabetical order. They act sort of as placeholders, the alternate names of other, more fully fleshed-out paths — that is, more cross-references.)

    I want to hold on to all this cross-reference information in my database, so I set up a bridge table to work something like a resource description framework, with the referring path as the subject, the referenced path as the object, and for the predicate I would use a description of the relationship type, such as “[subject path] is a continuation of [object path],” “[object path’s name] is an alternate name for [subject path],” “for more info see [object],” and so on. I now have all of this set up and ready to go, but before filling it in with information I want to have all the paths already in the database. Soon…
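    (To make the subject-predicate-object idea concrete, here’s a made-up example of how one such cross-reference might be stored as a row; the layout, names, and IDs are all just for illustration.)

        # A made-up example of one cross-reference stored as a
        # subject-predicate-object row; IDs and names are pure invention.
        import sqlite3

        conn = sqlite3.connect("native_paths.db")  # hypothetical database
        # "Path 12 is a continuation of path 7":
        conn.execute(
            "INSERT INTO path_relationships (subject_path, object_path, rel_type) "
            "VALUES (?, ?, (SELECT id FROM relationship_types WHERE label = ?))",
            (12, 7, "is a continuation of"),
        )
        conn.commit()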

    Meanwhile, the details of each path or town I add have all been real eye-openers. I often do a little internet research on each town, or village, or Native name I come across, and each bit of info, each piece of the puzzle, is another portal into that era.


  • On Second Thought

    Well, that didn’t take long…

    A few days of actually using the Input app, and I’m ready to throw it away and go back to QField, despite QField’s clunky data transfer method. Being able to put my collected data into a PostGIS database (without too many hoops to jump through) is more important than I realized, and Input’s data entry UI had a few quirks that just became more unpleasant every time I used it — there was something just plain off about its text boxes and typing…

    It’s a shame too, because the Mergin update process was exactly what I wanted. (Input can also handle QR codes as data sources, something I have absolutely no use for — but hey, nerdgasm alert.) Well, QField is eventually supposed to get its own cloud service, maybe they’ll be able to upgrade their data transfer process once they have that in place. My luck, they’ll reproduce Mergin’s setup, and reproduce Mergin’s PostGIS problems along with it.


  • Data Collection II: Input

    Just picking up where I left off here…

    I installed and started working with the other field geo-data collection app, called Input. It has a few drawbacks (so far) compared to QField, but it’s just as easy to use on the phone, and much easier to set up and transfer data back to the home computer.

    Input is basically a phone-app front end for Mergin, a cloud-based data storage service built to integrate with QGIS. Mergin stores the data, which can then be synched with QGIS on the desktop, or with the app. Collected data doesn’t need to be sent to the cloud in real time; if you have no data connection, it can be synched later, and synching is “just push the button” easy, as is synching between the cloud and QGIS. (This is the biggest advantage over QField, which has a pretty clunky update workflow.)

    The biggest problem I found so far has been that the data is not easily uploaded into a PostGIS table; the project relies on GeoPackages for data storage, even on the QGIS end. (Strictly speaking, there is a way to use PostGIS, but it seems involved, and needs to use another program that I have not yet tried.)

    There is also the issue of cloud storage: a free account is limited to 100 MB, and while the actual “location and description” data is usually pretty small, the accompanying photos, at 5-10 MB each, will quickly bump up against that limit. If I could offload the data and photos from the cloud-based project into something else (like my computer, and PostGIS), this limit wouldn’t be so pressing, but the workflow is starting to look unwieldy again…

    (Speaking of unwieldy, there is no direct way to store basemap tiles for offline use. Again, there is a way, but it’s not straightforward, and storing the tiles will also consume a significant portion of that 100 MB limit.)

    None of these problems are deal-breakers (so far), and the single advantage of easy synchronization more than makes up for the lot of them. I’ll be playing with Input/Mergin some more, but I think it’s the one I’ll decide to keep.


  • Data Collection

    Part of what I do, as a member of the D&L trail patrol, is document issues along the trail — downed trees, washouts — that may need to be addressed by the land managers. There is a specific report form for this kind of thing, where we enter a description, a location (GPS coordinates), and maybe one or two photographs; you can fill out the form on the trail if you have a data connection — a big “if” on the trail sometimes — so I usually do it at home on the laptop, where all things computer are easier anyway.

    My typical workflow: I stop and take a picture, and later at home I use the photo’s EXIF data to get the location. This can be a bit of a pain, so I was thinking that maybe there is some app where I could create an entry, with location data, photos, and maybe a timestamp, all added on the spot without the need for data connectivity; I could then call the note up and refer to it later at home. (I have a “notes” app where I can add photos and paste in my location, from say Google Maps or whatever, but I want the whole thing to be more integrated than that, with less human intervention.)
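    (As an aside, pulling the location out of a photo’s EXIF data can itself be semi-automated; here’s a rough Python sketch using the Pillow library, with a made-up file name.)

        # A rough sketch of pulling GPS coordinates out of a photo's EXIF
        # data with Pillow; the file name is made up.
        from PIL import Image
        from PIL.ExifTags import GPSTAGS

        def gps_from_photo(path):
            gps_ifd = Image.open(path).getexif().get_ifd(0x8825)  # GPSInfo IFD
            gps = {GPSTAGS.get(tag, tag): val for tag, val in gps_ifd.items()}

            def to_decimal(dms, ref):
                degrees, minutes, seconds = (float(v) for v in dms)
                value = degrees + minutes / 60 + seconds / 3600
                return -value if ref in ("S", "W") else value  # hemisphere sign

            return (to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
                    to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

        print(gps_from_photo("trail_issue.jpg"))  # hypothetical photo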

    There actually are some “geo-notes” apps, but I started overthinking things as usual, my wish list expanded, and then I discovered that there are two apps that actually integrate with QGIS: QField and Input.

    I am currently working with QField. You build a data-collection project in QGIS, then run the “QFieldSync” plugin to export the project into a format the app can use. Move the exported files to your phone, do your data collecting, move the updated files back to your computer, and import the updated version back into the original project. This process (export, move files, get data, move files back, import) is tiresome, especially since every data collection effort requires going through the entire cycle — I would much prefer something like “build project, export to phone, then: get data, upload data, get data, upload data…”

    Actually using the app, however, is easy. I set it up to record a timestamp with every point collected, then add a description and optional photos, and it works flawlessly; the only limitation I’ve found so far is the phone’s GPS, which is sometimes inaccurate. Once the data is back on my laptop, I can massage it (mostly automatically) into the form I need for the trail report.

    Next up is Input. This looks like the more promising app (based on their website), but the grass is always greener on the other trail…


  • Experiments in Routing, Part 2

    This is the second post in a series where I report back on my results from playing with various ways to do routing in QGIS and related programs. My immediate task is to identify the cycling amenities nearest to access points along the Lehigh Towpath. You can read Part 1 (the introduction) here. In this post, I’ll be using the QGIS built-in Network Analysis Library. Follow along after the break…
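    (As a little preview, here’s roughly what a shortest-path calculation looks like with that library in PyQGIS; the layer path and coordinates below are placeholders.)

        # A bare-bones shortest-path sketch using the QGIS network analysis
        # classes; the layer path and coordinates are placeholders.
        from qgis.core import QgsVectorLayer, QgsPointXY
        from qgis.analysis import (QgsVectorLayerDirector,
                                   QgsNetworkDistanceStrategy,
                                   QgsGraphBuilder, QgsGraphAnalyzer)

        trails = QgsVectorLayer("towpath.gpkg|layername=trails", "trails", "ogr")

        director = QgsVectorLayerDirector(
            trails, -1, "", "", "", QgsVectorLayerDirector.DirectionBoth)
        director.addStrategy(QgsNetworkDistanceStrategy())  # cost = length
        builder = QgsGraphBuilder(trails.crs())

        start = QgsPointXY(-75.60, 40.70)  # made-up coordinates
        end = QgsPointXY(-75.55, 40.72)
        tied = director.makeGraph(builder, [start, end])  # snap points to network
        graph = builder.graph()

        # Dijkstra from the start vertex, then read off the cost at the end
        idx_start, idx_end = graph.findVertex(tied[0]), graph.findVertex(tied[1])
        tree, cost = QgsGraphAnalyzer.dijkstra(graph, idx_start, 0)
        print("distance (in layer CRS units):", cost[idx_end])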
