I’ve been wanting to do this for a while, though my motives have been a bit unclear unless I count Internet street cred. The idea is to be able to use HTTPS rather than the HTTP protocol, where the “S” at the end stands for “secure:” the connection between my server and your browser is encrypted (using SSL, or TLS for the pedantic), in a way that keeps the data transferred between them secret, while also verifying that the data source is actually me. This comes in handy if we don’t want third parties snooping on our connection, or modifying the data by adding malware or advertisements. I’m not sure how much those are real problems for my little blog, but all the cool kids are moving to HTTPS so I guess I’d better do it as well.
The process was a bit time consuming, but it turned out to be easier and more straightforward than I thought it would be. The verification is done via a cryptographically-signed “certificate” from an already-trusted source (a certificate authority), and getting a certificate — for free — from a trusted source was the hardest part of the process. Even counting my learning curve (but not my fretting/waffling), getting and installing the certificate took all of 10 minutes. There were a few more hoops to jump through with my host (another 10 minutes, plus more waffling), but cPanel did most of the work, and now it’s done: my site communicates via the secure HTTPS protocol, redirecting from HTTP if necessary. You can see the green lock up in the address bar, which indicates the secure connection.
Now you can enjoy my site, knowing that it truly is me, talking to you, in secret…
Morning weigh-in: 190.5#, 13.5% BF (must have been that midnight egg-salad sandwich)
I get emails here, mostly spam, and mostly through my Contact Form, which then sends the messages along as emails. Things are usually quiet unless I post, but I’ve been posting almost every day lately, so I’ve been getting one or two spam messages a day. It’s kind of funny to watch the evolution of the subject/content as I slowly ban certain spammy words or phrases when I notice a pattern — I used to get way more than one or two, but the low-hanging fruit is all blocked now, and sometimes the ones that do get through make so little sense they just can’t be useful to the sender. Here are a few that I haven’t erased yet; most of this batch actually look like they’re supporting some business plan, however poorly thought out:
Subject: We Will Like To Get Some Information? Message Body: Hey, I hope your day is going well; we are sending this message to acquire more information…To find out more about us, please visit our website at: [redacted for your benefit]
Subject: just a though Message Body: GoodMorning, I was wondering if you’ve ever heard of [redacted]? its a place where you can purchase digital marketing services for extremely cheap from freelancers.
I just wanted to ask you about it since I know that as a webmaster or business owner (like myself), you’re always looking for ways to save a few dollars on services like online marketing etc.
Let me know what you think about it. I’ve used many gigs on there so I wanted to know your thoughts.I do recommend it though!
I love your site! Have a great day!
Subject: Improve Your Websites Rank in Google Message Body: New link building Software builds links to your website whilst you sleep. Save hours of manual labour and get your site ranked page 1 in Google in days. Submit your links to 1000’s of sites on complete autopilot and get a massive increase in traffic to your website. Try it out for a limited time at a special discounted price. Take a look at this powerful software in action; [redacted]
And just today, my sad new favorite:
Subject: Is It Too Late To Join The Bitcoin Revolution? Message Body: Bitcoin has increased in value by over 3000 in the last few years alone. The Crypto World is exploding with potential right now – this is the time to start and profit! Find out more; [redacted]
Kind of click-baity, but at least they aren’t the word salad I sometimes get… I don’t use anything like Captcha, but I have blocked entire domains and blocks of IP addresses, as well as a whole lot of words and phrases — can you guess any of them from the above examples?
Whoops! That’s been up like that for a day or so now — we’ve been eating pulled pork, more pulled pork and leftover pork for the last few days too, so I see a pattern. I did get in a trainer workout Tuesday night, but I also found myself struggling, pushing harder than I thought necessary to get my heart rate up — morning yoga/weights/push-ups seem harder over the last few days too. Natural slump? Over-training? I plan to just work through it.
I went to the allergist yesterday. It was good to see her, get updates on my allergy sensitivities — not much has changed — and get some good advice (and medication) for dealing with my eczema. We’ll see how the new regimen works out.
Meantime, I’ve been working on some Maps for CAT. Here’s a sample; it’s a work in progress but I feel pretty good about it. I actually used my routing program to do the directions — with a little bit of human editing!
Yesterday’s ride along the canal was a slopfest, clothes and bike gray & gritty from the gravel/cinder surface, and I was whooped by a 14 mile ride over soft paths. So today, I’m heading out again — this time with Greg H to some actual trails, which stay drier and more solid, hopefully. I’m heading out in a few minutes, just after blogging and a little lunch.
A quick aside on the mapping front: I took a long time dithering about it, but I wrote my own chainage routine, and my own ascent/descent calculation function, both in PL/pgSQL, and both — especially the ascent routine, where there was a lot of room for improvement over my PyQGIS script — worked perfectly. (The ascent routine took about 20 minutes to run everything, as opposed to 4-8 hours for QGIS.) I still have to zero out the data at bridges, but I am now back to where I can wait for outside data (recommended routes, etc) to continue.
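For the curious, the core of an ascent/descent calculation is simple once the chainage points exist: walk the elevation samples in order and accumulate the positive and negative differences separately. Here’s a minimal Python sketch of that logic — the actual PL/pgSQL version operates on database rows, but the arithmetic would be the same; the function name and sample values here are just for illustration:

```python
def ascent_descent(elevations):
    """Total ascent and descent from an ordered list of elevation samples,
    e.g. one chainage point every 10 meters along a road segment.
    Positive differences accumulate as ascent, negative ones as descent."""
    ascent = descent = 0.0
    for prev, curr in zip(elevations, elevations[1:]):
        delta = curr - prev
        if delta > 0:
            ascent += delta
        else:
            descent -= delta
    return ascent, descent

# A short made-up profile: up 2.5 m, down 1.5 m, up 4 m
print(ascent_descent([100.0, 102.5, 101.0, 105.0]))  # → (6.5, 1.5)
```

One caveat with samples this dense: small noise in the elevation data can inflate both totals, so some smoothing or a minimum-change threshold may be worth adding.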
Another dusting last night, an easy shovel job but the neighborhood looks really pretty, especially on my walk this morning. Anne went early to deal with her office’s walkways, then met Debbie for breakfast at the new breakfast place on Main Street (the Flying Egg, go there it’s pretty nice). Anyway — after I got up, and shoveled here — I texted to see if she needed help; she replied that the job was done and I should come over and join them. Great start to the day, nice to see Debbie, and the point of my story was that it was beautiful out, with early-morning-rosy winter clouds, before it all morphed into a generic “sunny winter day,” which was nice in its own way but that early sky really was cool.
On the home front, we got our new oven yesterday. It looks pretty nice and stainless-steel modern, the range is a bit more aggressive than our old one and, most important, the oven keeps the correct temperature. Too bad the delivery came while I was trying to sleep in — not too early really, but before 9:00, and I was trying to catch up on my sleep after a rough few days…
I’ve had a bit of an eczema problem lately, and it really got crazy this week. We super-cleaned the house, I switched to baths instead of showers… and I broke down yesterday, went to my GP and got some prescription strength cortisone cream, as well as a Prednisone prescription. I’ve been warned about euphoria, mania etc as side effects, but nothing: I’ve basically been just putzing around the house today, though my skin is running through a fast-motion miracle cure so there’s that. I have an allergist appointment in the New Year, and I got a referral for a dermatologist from the GP. I’m going up in the attic soon to find the humidifier. Life goes on.
Meantime, the mapping — rather, the fixing of the mapping scale-up problems — continues. I had problems with getting the elevation changes, and eventually had to abandon a QGIS solution and build my own PostGIS function to get the “chainages.” The term is apparently a holdover from ancient surveyor days, when they used chains to measure distances; what I needed was a shapefile of points, set every 10 meters along each road in the database, but the new file had to refer back to the road database in a certain way, and the QGIS plugin just wasn’t flexible enough for what I needed. (My solution worked like a charm.) The next step was to use SAGA and my elevation data to give each point an elevation; since the new chainages were themselves now in the database rather than a standalone file, that process was its own struggle/learning experience, but it’s done now. Next up is generating the ascent/descent data, which I might decide to do in PostGIS as well — my current, PyQGIS-based method is run-all-night-check-results-in-the-morning slow. Tomorrow, or this weekend…
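The chainage idea itself is straightforward: walk each road geometry, drop a point every 10 meters, and record the distance along the road so each point can refer back to its source segment (in PostGIS this is the kind of job `ST_LineInterpolatePoint` is built for). A toy Python version, with hypothetical names and planar coordinates, might look like:

```python
import math

def chainage_points(line, spacing=10.0):
    """Interpolate points every `spacing` units along a polyline given as a
    list of (x, y) vertices. Each result keeps its distance-along-line
    ("chainage") so it can be joined back to the source road record."""
    points = [(0.0, line[0])]  # the start vertex is chainage zero
    target = spacing           # next chainage distance to emit
    travelled = 0.0            # distance covered by completed segments
    for (x1, y1), (x2, y2) in zip(line, line[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        while target <= travelled + seg:
            t = (target - travelled) / seg  # fraction along this segment
            points.append((target, (x1 + t * (x2 - x1), y1 + t * (y2 - y1))))
            target += spacing
        travelled += seg
    return points
```

A 25-meter vertical line, for example, yields points at chainages 0, 10 and 20; the remainder is dropped, which matches how most station/chainage tools behave.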
Mapping: I had, and still have, a few technical issues to deal with, but the full Lehigh Valley database is now in PostGIS, along with elevation data — bogus elevation data, that’s one of my technical issues — and the demo map can now route with the new database. But it’s got the slows, it’s got the slooowws… With about 3200 road segments in the “toy database,” it could route in about 1-2 seconds, but the full-map version took about 6 seconds per routing task — and there may be multiple routing tasks in each route, from start point, to via point and then through subsequent via points, and finally to the endpoint. Unacceptable!
I did some searches online, and sure enough there are a lot of people complaining about pgRouting performance and looking to speed it up. The general consensus: there are a few things you can do, including tune your database, but the actual bottlenecks are the pgRouting algorithms. Some suggested using osm2po, another program that converts OpenStreetMap data for databases but can also do routing: I tried it and it’s blindingly fast – d’oh! (Unfortunately, I didn’t see much there in the way of customized, dynamic cost functions, so I can’t see how to turn it into the answer I’m looking for.) I tried a bunch of the Postgres/PostGIS performance-tuning tips anyway, and they did seem to help a little.
I eventually came across one potential solution: route only on a subset of the roads in the database, using a bounding box. For each pair of points to route between, I find the smallest rectangle that contains both, then expand it by 2000 meters in every direction (like a buffer zone); this is my bounding box, and the routing search is limited to the roads that touch or fall within that box. This seemed to do the trick: my routing times are back down to about 1-2 seconds.
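The box computation itself is trivial; here’s a minimal Python sketch, assuming planar coordinates in meters (in PostGIS, expanding the two points’ envelope with `ST_Expand` and filtering roads with the `&&` operator would do the same job — that’s standard PostGIS, not necessarily exactly what my query looks like):

```python
def routing_bbox(p1, p2, buffer=2000.0):
    """Smallest axis-aligned rectangle containing both routing points,
    expanded by `buffer` (meters) in every direction. Only roads that
    touch or fall within this box are handed to the router.
    Returns (min_x, min_y, max_x, max_y)."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2) - buffer, min(y1, y2) - buffer,
            max(x1, x2) + buffer, max(y1, y2) + buffer)

print(routing_bbox((0, 0), (1000, 500)))  # → (-2000.0, -2000.0, 3000.0, 2500.0)
```

Note the min/max calls make the result independent of which point is the start and which is the end.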
Except near — wait for it — those confounded bridges. The valley is broken up by the Lehigh river, with occasional bridges, and if there are no bridges within the bounding box for a route that needs to cross the river, no route will be found. Meanwhile, when routing points are on a diagonal, the bounding boxes are fairly big, but routing points that run mainly east-west or north-south produce long, skinny bounding boxes. I found a few “dead zones” where routes couldn’t be found, especially east-west ones north of Northampton, routes with skinny bounding boxes where the bridges are a little sparser. My original bounding boxes were expanded by a buffer that was only 1000 meters; I went to 2000 meters in an attempt to alleviate the bridge problem. This didn’t solve it entirely, but it did help, and there was no real performance hit going from 1000 to 2000 meters. I’ll probably look at distances between bridges, and revise my buffer zone to be just bigger than say, half that distance.
Reading: I picked up Don DeLillo’s Underworld again, intending to just read the first part. I love the first chapter but never finished the book because I found the rest boring; now I am engrossed and don’t know what I was thinking back then.
Listening: WXPN has been playing “The 70’s, A-Z” this past week, every song they have in their library that was released in the Seventies, played in alphabetical order. We’ve been following along religiously, and it’s been fascinating and fun but they’re only up to “T,” and it gets wearing. Full disclosure: the radio is off right now…
The only time they weren’t playing the 70’s was for their Friday “Free at Noon” concert at the World Cafe, which this week featured Russ’s band Cherry. So, we went down to Philly with Ray and Lorraine, where we met Frank and Patricia, and Ben, and Gabby, and we all watched the show and then went out to lunch with Russ at the White Dog Cafe. As always, we spent a few minutes at Penn Books before the ride home. All the talk in Philly, among us and overheard on the street, was about the upcoming snow on Saturday…
By the way, Saturday was Luminaria Night in Bethlehem, here is a photo of ours:
Just kicking back this morning, before going with Anne over to the Bike Co-op for the afternoon…
Reading: I just finished N.K. Jemisin’s debut novel, The Hundred Thousand Kingdoms. I took to it well enough at the beginning, but it actually became a chore to read: I put it down for a week, and read the last third in two sittings, closing the book with a sense of relief yesterday. Strange because I really liked her award-winning “Broken Earth” trilogy, and the style and voice were very similar; Anne said that maybe the author worked a few bugs out of her writing between her debut and the trilogy, and that may be so but I didn’t really see it. All I can say is that I really recommend the trilogy, but don’t feel the same about this one. I think it’s also the first of two, but it’ll be a while before I read the sequel.
Two Hours Before The Mast: I did my usual Wednesday volunteering at the Canal Museum yesterday. The canal boat is now in dry-dock for the winter, and Scott E is trying to get as much maintenance done on it (especially things like painting) in the nicer weather as he can, so yesterday I helped prep the deck for staining. Mostly this meant sanding, and the sanding I did was mostly “trim work” with a small vibrating sander, near fixtures and in corners where the bigger unit couldn’t fit — I did this for about two hours until the little sander overheated and turned off. I thought of it as “swabbing the deck,” but showed remarkable restraint and did not talk like a pirate.
Mapping: The routing website is now essentially — well, not done done, but the functionality is pretty complete. It routes, with a few glitches (but I added error handling so it doesn’t just choke without apologizing), it modifies routes based on user preferences for hills and visible recommended streets, and it can export the route as GPX; the final steps for website usability are to add printing capabilities for the directions, and add some explanatory content. (Finishing the job means building the real database — and finding a place to put it online.) I’m pretty happy with how this came out so far, it’s actually fun to play with.
Listening: Not to eMusic, that’s for sure. I’ve used them for years to purchase music, and once they were both a good deal at a flat 49 cents a song (with no DRM: download it and it’s yours), and a good source for whatever I was looking for. Then in about 2010, they bought into some of the more mainstream catalogs, changing their price structure — more popular stuff became more expensive, some songs required you to buy the entire album — to accommodate the new sources. This actually drove away many of the better and more obscure labels, leaving eMusic no better than any other generic source, at least in terms of selection. Now the major labels are gone again (I think), and the catalogs are mostly things I don’t care about. So every month I pay $15, which gives me $17-$18 in credit to use or lose that month, and I hardly ever even check in anymore to see their new offerings — and whenever I go there to search for something specific, they don’t have it. It’s time to move on.
On the Home Front: We are busy researching ovens, in preparation for our new purchase.
Talk about hedonic adaptation! A week or so ago I was sure I was far from ever being able to route on my web map using pgRouting, then a triumphant breakthrough, and now here I am, annoyed that it’s not perfect…
My first problem is a data issue, and a recurring one in my mapping and routing life: dealing with bridges. Once I got the routing to work, I started to customize it with a separate “get_cost” function, which deals with ascent and descent (other criteria are coming), and that worked fine. Then I noticed that the routes seemed a little off, like they were avoiding what I thought would be the optimal routes, the main one being that it would do a lot to avoid a certain section of Broad Street. That’s when I remembered: there are several bridges on that section, and rather than following the elevation of the bridge’s road surface, my elevation data followed the contours of the ground below it, leading to large ascents and descents along that section.
To solve the immediate problem, I changed the ascent and descent to be zero for the section containing the bridge — close enough to the truth, for that short a span on a mostly flat road. That made the routes in the vicinity more sensible, but what to do about other bridges?
I think I have three options: I can find the actual elevation data for the top of the road surface (using a “digital surface model” rather than a “digital elevation model,” and probably using LIDAR rather than satellite radar data); I can assume that the bridge has a mostly constant slope, and calculate the slope from elevations where it attaches to the ground; or I can save myself a lot of work and just say “they’re flat, or flat enough to make no difference,” and make all ascents and descents be zero for bridges. I am still thinking about this…
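The constant-slope option is also the easiest one to sketch: take the elevations where the bridge meets the ground at each end, and linearly interpolate across the chainage points in between, overwriting the bogus under-bridge values. A hypothetical Python version (the names and sample numbers are mine, not from the real database):

```python
def bridge_elevations(chainages, ground_elevations):
    """Replace under-bridge elevation samples with a constant-slope
    estimate. `chainages` is the distance-along-road of each point;
    the first and last points are assumed to sit on solid ground, so
    only their elevations are trusted — the middle values are ignored."""
    start, end = chainages[0], chainages[-1]
    z0, z1 = ground_elevations[0], ground_elevations[-1]
    span = end - start
    return [z0 + (c - start) / span * (z1 - z0) for c in chainages]

# A 100 m bridge rising from 200 m to 210 m; the bogus mid-span
# sample (180 m, the riverbed below) gets replaced by 205 m.
print(bridge_elevations([0, 50, 100], [200.0, 180.0, 210.0]))  # → [200.0, 205.0, 210.0]
```

Since the ascent/descent function only sees differences between consecutive samples, this also guarantees a bridge contributes exactly its end-to-end climb and nothing more.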
The second problem is a little harder to figure out, since it involves the PostGIS routing function I got off the Internet: when the beginning and end points of the route are on the same segment of road (i.e. there are no intersections between them), the function fails. I don’t know enough about Postgres functions to be able to solve this, so I may have to either live with it for a while, or contact the person who wrote the function for some help.
I thought this would take so much longer… I have a web map designed and working, and I was able to add generic routing functionality to it (via the Leaflet Routing Machine plug-in). Here’s a screenshot:
Once I got routing up and running, I came across a few problems, mostly with how the directions are displayed: the display only showed street names and distances — that is, no turn information — and, wherever the route continued straight on the same road through an intersection, there was an unnecessary instruction to “go straight.” Both of these were due to errors in lrm-pgrouting, which I managed to fix to my liking.
So, success! But this is using a generic routing function, rather than the “climbing vs busy road vs recommended commuter routes” function I ultimately want to use, so there’s some room for improvement. I also think I may want to abandon the Leaflet Routing Machine and do the input and display on my own. Also also, all this stuff currently resides only on my laptop; I have to find a host that will let me run a PostGIS database (among other things). I still have some work to do.
The warm weather finally broke (again) with this recent rainy spell, let’s hope it lasts but for now it’s nice and cool…
While we were in Pittsburgh I kind of got fed up with my phone sending me messages about memory use — it wasn’t really all that close to full, but it was getting closer every day, for no reason I could see, and the messages were getting more ominous. I had already moved as many apps as I could to the external drive, along with my photos, music, etc.; there should have been very little on the internal drive at all, much less enough to cause problems. What gives?
I started looking through the folders on the drive, Googling their names and trying to find what was going wrong. Turns out (among other things) that my photos were being stored several times on my phone, over a thousand photos, each a few megabytes, and while my regular photo storage is on the external drive, the backups were filling the problematic internal one. Several minor changes to the settings, turning off “cloudagent” or whatever, and I recovered a huge chunk of storage space. I was so happy I deleted a bunch of apps I don’t use, freeing up another chunk. The best part? They’re staying freed up.
Meanwhile, back home on the laptop… my hard drive has two partitions: one large partition where my old system was, and another that holds the root of my current system — smaller than the first partition, but by no means small. Unfortunately, the used space was also constantly growing, and I was down to like 25 gigabytes free — which sounds like a lot, but the disk is old and on the small side, only 350 gigs total, so I was down to less than 10 percent usable space.
Once again, it was the cloud. I’ve been putting it off for a while, but I knew that the problem was my Dropbox folder, which was huge and growing (photo backups from my phone), and things would be a lot better if that folder was on the bigger, emptier partition. This required some work — the Dropbox preferences program wasn’t working correctly, so I had to fix that first, then the move itself took a while — but in the end, the move was a success, and I freed about 25G on my main partition.
That got me excited enough to look at what was on my auxiliary partition, maybe there was even more I could free up… I have my old system backed up on a network drive, but the old home folder (about 65G) was still there, and all the useful stuff had been moved over to the main partition already. I couldn’t bear to just erase it, so I moved it all over to the network drive (where there are now two full backups, but with two terabytes it’s a drop in the bucket) and now I have tons of free space — 190G of 350G total — on my drive.