Monday, December 19, 2016

Scroll-Animated Image-Based Movies

Last week I had an interesting request that was fun to figure out. The GIS folks had put together a series of images forming an animation, as one zooms in on the California coast. Inspired by parallax image effects, they wanted to know if the image animation could be tied to scrolling the page, so the animation plays a frame at a time as you scroll.

We tried a few existing toolkits that do this, such as ScrollMagic, but found that the animation was not at all smooth, and that it didn't handle some of the cases the web designers specifically had in mind. A few of them didn't work on mobile at all.

So I ended up creating my own, and came up with two prototypes you may find interesting.

Scroll-Animated Header


This first prototype has a top-and-center banner image on the page, as with a lot of sites. But the page content locks to the height of your screen and forces scrolling, and the banner animates as you scroll.

https://github.com/GreenInfo-Network/ScrollAnimatedHeader

The effect is pretty neat, right?

This one was meant as a proof of concept and a demonstration of a technique, not a redistributable library. Still, if you're interested, you can hit up the repo above or View Source on the demo page: https://greeninfo-network.github.io/ScrollAnimatedHeader/
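The heart of the technique is just arithmetic: map the page's scroll offset onto a frame number, then swap the visible image. A minimal sketch of that mapping (function and variable names are mine for illustration, not the ScrollAnimatedHeader source):

```javascript
// Map a scroll position onto an animation frame index.
// Illustrative sketch, not the library's literal code.
function frameForScroll(scrollTop, maxScroll, frameCount) {
    // clamp progress to [0, 1] so overscroll can't pick an out-of-range frame
    var progress = Math.max(0, Math.min(1, scrollTop / maxScroll));
    return Math.round(progress * (frameCount - 1));
}

// on scroll, show the frame for the current position, e.g.:
// window.addEventListener('scroll', function () {
//     banner.src = frames[frameForScroll(window.scrollY, maxScroll, frames.length)];
// });
```

Because the frame is a pure function of the scroll position, scrolling back up plays the animation in reverse for free.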




jquery.Reanimator


Turns out that what the web folks really wanted was multiple, smaller image-animations sprinkled and embedded throughout the page content. The same way you'd normally add some float:right and float:left images throughout your text, they wanted image-animations.

So here's a look at a new jQuery plugin I created: jQuery.Reanimator.

https://greeninfo-network.github.io/jQuery.Reanimator/

Unlike the previous one, this is specifically designed to work with multiple such animations, and to make no assumptions about the page flow (e.g. no fixed header and no fixed scrolling zone). And it's implemented as a jQuery plugin, so it's almost copy-paste easy to apply to any one, or any several, such animations.

jQuery.Reanimator sports some neat features as well.
  • Animations can also be played via mousemove and click effects, providing alternate ways of triggering the animation should scrolling be insufficient.
  • Ability to add padding to the top and bottom of the scrollable area for an image.
  • Options can be set in the constructor, and overridden per-animation via data-reanimator-XXX attributes. This makes it easy to customize individual animations.
  • A "debug mode" that shows the scrollable areas and other internals, to help with testing and tuning the animations for your website.
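The per-animation override works like the usual jQuery-plugin pattern: start from the constructor-wide defaults, then let a data-reanimator-XXX attribute on the element win. A sketch of that resolution logic (option and function names here are hypothetical, not necessarily the plugin's own):

```javascript
// Resolve a plugin option: a data-reanimator-* attribute on the element
// overrides the constructor-wide default. Illustrative sketch only.
function resolveOption(defaults, dataset, name) {
    // data-reanimator-padding-top appears in el.dataset as "reanimatorPaddingTop"
    var dataKey = 'reanimator' + name.charAt(0).toUpperCase() + name.slice(1);
    return dataset[dataKey] !== undefined ? dataset[dataKey] : defaults[name];
}
```

So an element with data-reanimator-padding-top="100" would get 100, while its siblings fall back to whatever the constructor was given.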
For more info and to download it for your site, hit up the repo:

https://github.com/GreenInfo-Network/jQuery.Reanimator










Wednesday, November 23, 2016

Many things brewing

Happy harvest festival!

I realized the other day that it's been 5 months since my last posting. Things have been happening, but many of them are still in progress so I've kept quiet. But here's a sampling of some of what I've been working on.

AccordionLegend control for Leaflet

 

This was actually my prior blog post, back in June. It's a Leaflet control that displays a spiffy collapsing legend, with layers broken into accordion sections, with legend swatches and opacity sliders and all.

http://greeninfo-network.github.io/L.Control.AccordionLegend/

Just today, someone reported a bug with it; I fixed that, then went on to improve the demo and add a few more features.

Early Warning System v2


The EWS scrapes project disclosures from several international funding banks such as the World Bank, International Finance Corporation, African Development Bank, Asian Development Bank, and others. The material is reviewed by the International Accountability Project, looking for projects that could threaten human rights, e.g. involuntary relocation of a whole town, and connecting with local activists.

http://rightsindevelopment.org/?page_id=2421

I did the original EWS system back in 2011, but they wanted to clean things up, expand the system, add support for other folks editing it, etc. Internal goals included some code cleanup, a nicer model structure, better error handling, a nicer search UI, and more. And this project is in Django, which I just love, so that's a bonus.

The EWSv2 is coming along well, and preliminary feedback is quite positive. We should have the thing launched to the public in a month or three.

Tobacco retailer visualizations, next-generation system

 

CounterTools, a long-time client of GreenInfo Network, has us make interactive web maps of tobacco retailers, inspections and undercover purchases of tobacco, and derivative stats such as "percentage of undercover purchases which resulted in a sale, by county." Our oldest maps with them go back to late 2011 or early 2012, with the latest of the current generation having been set up only a few weeks ago.

I'm working on the next generation of both the data management and mapping components. The project involves 3 agencies for 3 sections: back-end, front-end, and GIS/mapping. The mapping fits in the middle, firmly in both front-end and back-end: data models and tables, file uploads and data processing, data searching and reporting, and front-end cartography.

Unlike the current generation, this one will work with a living dataset, with new retailers and retailer inspections being added daily, so there's a whole new dimension of intelligence in the upload capabilities and error handling, in the calculations and rendering, and so on. Also unlike the current generation, the front-end is in AngularJS, and the back-end is in Django. (yay!)

I've been learning a lot from how the other teams do their jobs: AngularJS and Sails for the front-end, Django REST Framework for writing new APIs, their particular tactics for laying out the sections of the Django app, and so on. It's been a learning curve, but I'm appreciating the education.

Coder Commando


I've also been handling the usual flow of miscellaneous requests that aren't large projects, but rather "my coder quit and I launch in two weeks!" situations. This happens a lot more often than you would imagine; four projects in the last five months come immediately to mind.

I don't usually blog about these since they're not usually the lengthy and involved projects, but they do keep me busy. Some of my recent ones have involved some neat moving parts:
  • A tool to draw a polygon on a Google Maps map, and derive statistics from census blockgroup data, then download a PDF.
  • Learning about Expression Engine, a CMS from the folks who made CodeIgniter, and writing a bunch of fixes in various pages, some of them maps and some not.
  • A map of farms and farmers' markets statewide, with a nice CMS on the back end where one manages five pages of details about their farm. Notably, the part that lets you upload photos of your farm has a UI for zooming and cropping the photo, so your 200x100 thumbnail looks the way you want it.

Happy Harvest

 

So yeah, I've been silent but not idle. I'm told of at least 2 new projects waiting for signatures, and some of these projects in progress will likely start going live over the next few months. Once the lid is off, I can describe them in more detail.

Have a nice holiday!





Tuesday, June 14, 2016

New Leaflet control: L.Control.AccordionLegend

Some time back I came up with a custom legend control for Leaflet. Unlike the basic L.Layers control, we wanted it to be really snazzy:
  • The layer list broken into sections, not by basemap/overlay but by thematic similarity: Parks as a section, Admin Boundaries as a section, Health Statistics as a section, etc. The sections should have accordion-style behavior: clicking a section title should expand it and collapse the others, showing only one section at a time so as not to overload the screen, and so on.
  • Each layer has an opacity slider.
  • Each layer has a legend.
So, here it is:
https://github.com/GreenInfo-Network/L.Control.AccordionLegend
http://greeninfo-network.github.io/L.Control.AccordionLegend/

Enjoy!

Friday, June 3, 2016

New Leaflet control: L.Control.CartographicScale

A request we get now and then is to add a readout of the current map scale denominator, e.g. a readout that changes from "1:24K" to "1:12K" as you zoom in a notch. These are handy for cartographers who still think in terms of scale instead of zoom levels.

OpenLayers had the handy-dandy OpenLayers.Control.Scale, but I couldn't find a similar control for Leaflet. So I wrote one.

http://greeninfo-network.github.io/L.Control.CartographicScale/

https://github.com/GreenInfo-Network/L.Control.CartographicScale



I should point out that the accuracy of this sort of thing has always been dubious. Between the projection itself (a "square degree" in Lagos is not the same area as one in Barrow, and Greenland is not that big in real life) and the realities of screen resolution (96dpi is for mid-90s CRTs), these scale readouts have always had very poor accuracy and always will. It's the mathematics of a world that isn't flat, you know?

Still, the control is accurate enough for cartographers to get the sense of "2K or 20K?" and that's what was really needed here.
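The arithmetic behind such a readout is straightforward, which is part of why those accuracy caveats apply. A sketch of the computation (my own illustration assuming Web Mercator and a 96dpi screen, not the control's literal source):

```javascript
// Approximate the map scale denominator for a Web Mercator map.
// Illustrative sketch; assumes 96dpi and carries all the caveats above.
function scaleDenominator(zoom, latitude) {
    // ground meters represented by one tile pixel at this zoom and latitude;
    // 156543.03392 m/px is the zoom-0 value at the equator, halving per zoom
    var metersPerPixel = 156543.03392 * Math.cos(latitude * Math.PI / 180) / Math.pow(2, zoom);
    // physical size of one screen pixel at 96dpi, in meters
    var screenPixelMeters = 0.0254 / 96;
    return Math.round(metersPerPixel / screenPixelMeters);
}
```

At the equator this gives roughly 1:591M at zoom 0, halving with each zoom level, e.g. about 1:72K at zoom 13.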

Friday, May 20, 2016

In-Browser filtering of raster pixels in Leaflet, Part 3

In my last posting, I described the process of creating a three-band TIFF and then slicing it into three-band PNG tiles for display in Leaflet. The tiles are not visually impressive, being almost wholly black, but they have the proper R, G, and B codes corresponding to the location's PHZ, PCP, and GEZ. (If you don't know what those are, go back and read that post from a few days ago.)

So now the last step: transforming these tiles based on a set of arbitrary RGB codes determined by the user. Let's go through the working source code for the L.TileLayer.PixelFilter and I'll describe how it works.

First off, we need two RGBA codes so we can color the pixels as hit or miss, and we need the list of [r,g,b] trios that would be considered a match. The initialize method accepts these as constructor params, and the various set functions allow them to be changed later as well.

(The decision of what list of RGB codes should be passed into setPixelCodes() is a matter of business logic: selectors, clicks, lists... outside the scope of this discussion. See the demo code for a trivial example.)

Next, we need to intercept the map tile once it's in the browser. Thus the tileload event handler set up in the constructor, which runs the tile through applyFiltersToTile() to do the real work.

applyFiltersToTile() is the interesting part, which only took me a couple of hours sitting down with some HTML5 Canvas tutorials. Let's dissect it one piece at a time:
  • "copy the image data onto a canvas" The first paragraph creates a Canvas of the same width and height as the image, and copies the image data into it. The IMG object itself won't let us access raw pixel data, but once it's in a Canvas we can.
  • "target imagedata" We then use createImageData() to construct a new, empty byte sequence, which can later be assigned back into a Canvas using putImageData(). This is effectively a straight list of bytes, and as we write to it we must always write 4 bytes per pixel: R, G, B, and A.
  • "generate the list of integers" Before we start looping over pixels, a performance optimization. Arrays have the indexOf() method, so we can determine whether a given value is in an array, but it only works on primitive values, not three-element arrays. The lodash library has findIndex(), which would work, but that means a new dependency... and the performance wasn't great either (256 x 256 = 65536 pixels per tile). So I cheat, and translate the list of desired pixelCodes into a simple list of integers so that indexOf() can work after all.
  • "iterate over the pixels" Now we loop over the pixels in our Canvas and do the actual substitution. Again, we step over the bytes in fours (R, G, B, A), and we assign them into the output imagedata in fours as well. Each pixel is translated into an integer, and if that integer appears in the pixelCodes list of integers, it's a hit. Since indexOf() is a native function it's pretty swift, and since our pixelCodes list tends to be very short (10-50 values), the usually-avoided loop-in-loop is actually quite fast.
  • "push a R, a G, and a B" Whatever the result, we push a new R, G, B, A set of 4 bytes onto the output imagedata.
  • "write the image back" And here we go: write the imagedata sequence back into the Canvas to replace the old data, then reassign the img.src to the Canvas's base64 representation of the newly-created imagedata. Now the visible tile isn't based on an HTTP-fetched file at all, but on a base64-encoded image in memory.
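The "list of integers" trick is worth spelling out: each [r,g,b] trio is packed into a single 24-bit integer, precomputed once per setPixelCodes() call, so the native indexOf() can do the membership test inside the pixel loop. A sketch of that packing (illustrative, not the plugin's literal code):

```javascript
// Pack an [r, g, b] pixel into one 24-bit integer: 8 bits per channel.
function packPixel(r, g, b) {
    return (r << 16) | (g << 8) | b;
}

// Precompute the packed form of every wanted pixel code, once per
// setPixelCodes() call -- NOT inside the 65536-iteration pixel loop.
function packPixelCodes(pixelCodes) {
    return pixelCodes.map(function (rgb) {
        return packPixel(rgb[0], rgb[1], rgb[2]);
    });
}

// inside the pixel loop, the membership test is then a single native call:
// var hit = packedCodes.indexOf(packPixel(r, g, b)) !== -1;
```

Since every (r,g,b) maps to a unique integer, the packed comparison is exactly equivalent to comparing the three bytes separately, just much cheaper per pixel.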
The only gotcha I found was that upon calling setPixelCodes(), the tiles would flash to their proper colors and then instantly all turn into the no-match color. It worked... then would un-work itself?

The tileload event wraps the IMG element's load event. This means that when I assigned the img.src to replace the tile's visible representation... this was itself a tileload event! It would call applyFiltersToTile() again, and this time the RGB codes didn't match anything on my filters so all pixels were no-match. Worse, the assignment of img.src was again a tileload event, so it was in an infinite loop of processing the tile.

Thus the already_pixel_swapped flag. After processing the IMG element, this flag is set, and applyFiltersToTile() will skip subsequent runs on that IMG element. If we need to change pixel codes, setPixelCodes() calls layer.redraw(), which empties out all of these old IMG elements anyway, replacing them with fresh new ones that do not have already_pixel_swapped set.

So yeah, it was an educational day. Not only did we achieve what the client asked for (the more complex version, not the simplified case I presented last week) as a neat reusable package, but the performance is near-instant.


Tuesday, May 17, 2016

In-Browser filtering of raster pixels in Leaflet, Part 2

A few days back I posted a link to L.TileLayer.PixelFilter. This is a Leaflet TileLayer extension which rewrites the tiles after they have loaded, comparing each pixel against a set of pixel-codes and replacing the pixel with either a "matched" or "not matched" color. It's pretty useful for generating dynamic masks and highlights, if your back-end data is raster rather than vector.

I had said that the client's request was to display plant hardiness zones, and that I was using the Unique Values colors as the RGB codes to match against zone codes. I lied. The reality was much more complicated than that, but it would have been a distraction from the end result. So here's the longer story.

The Rasters and the Requirement


The client has 3 rasters: Plant Hardiness Zones (PHZ), Precipitation classifications (PCP), Global Ecoregions (GEZ). Each of these is a worldwide dataset, and each has about 20 discrete integer values numbered 10 to 30ish.

The user selects multiple combinations of these three factors, e.g. these two:
  • Zone 7 (PHZ=7)
  • Rainfall 20-30 inches per year (PCP=3)
  • Temperate oceanic forest (GEZ=31)

  • Zone 7 (PHZ=7)
  • Rainfall 20-30 inches per year (PCP=3)
  • Temperate continental forest (GEZ=32)
The user would then see any areas which match any of these "criteria trios" highlighted on the map. The idea is that a plant that's invasive in these areas would also be invasive in the other areas highlighted on the map.

Fortunately, the rasters are not intended to be visible and do not need to be visually pleasing. They need to be either colored (the pixel matches any of the trios) or else transparent (not a match).
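Expressed as code, the matching rule is simple: a pixel is highlighted if its (PHZ, PCP, GEZ) trio exactly equals any one of the user's criteria trios. A plain statement of that logic (illustrative only; the deployed version does this with a faster integer trick):

```javascript
// A pixel matches if its three-band value equals ANY criteria trio exactly.
// Illustrative sketch of the requirement, not the optimized production code.
function matchesAnyTrio(pixel, criteriaTrios) {
    return criteriaTrios.some(function (trio) {
        return trio[0] === pixel[0] && trio[1] === pixel[1] && trio[2] === pixel[2];
    });
}
```

So with the two example trios above, a (7, 3, 31) pixel is a hit and a (7, 3, 33) pixel is not.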

Raster To Tiles To Leaflet


Three variables and a need for one raster... sounds like we could use a three-band raster! I used ArcMap's Composite Bands tool to merge the three rasters into a three-band raster. Voila, one TIFF with three-part pixels such as (7, 3, 31) and (7, 3, 32). I just have to keep straight that R is PHZ, G is PCP, and B is GEZ.

Second, I needed to slice this up into map tiles. But it's very important that:
  • The tiles must be in a format that preserves R, G, and B exactly. Something like GIF or JPEG would be entirely unsuitable: GIF picks a 256-color palette, and JPEG is lossy and fudges colors together. TIFF, on the other hand, is not viewable in browsers as a map tile. But PNG is just perfect: RGB by default, and viewable in browsers.
  • The tile-slicing process (I picked gdal2tiles.py) must also preserve exact RGB values. The default average resampler uses interpolation, so a pixel in between a PCP=3 and PCP=2 would get PCP=2.5 and that's no good! Use the nearest neighbor resampling algorithm, which guarantees to use an existing pixel value.
Slicing up the TIFF into web-ready PNGs was pretty straightforward:
gdal2tiles.py -z 0-7 -r near -t_srs epsg:3857 threeband.tif tiles

Point Leaflet at it, and I get some nearly-black tiles loading and displaying just as I expected. Well, after I remembered that gdal2tiles.py generates tiles in TMS numbering scheme, so I had to set tms:true in my L.TileLayer constructor.
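That TMS gotcha deserves a note: TMS and the XYZ scheme Leaflet expects differ only in the Y axis, with TMS counting tile rows from the bottom of the world and XYZ from the top. The flip is one line (my sketch of the arithmetic; Leaflet's tms: true option does the equivalent when building tile URLs):

```javascript
// Convert a TMS tile row to the XYZ (Leaflet/Google) row, or back again;
// the transform is its own inverse. There are 2^zoom rows at a given zoom.
function flipTileRow(y, zoom) {
    return Math.pow(2, zoom) - 1 - y;
}
```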

I downloaded a few of the tiles and opened them up in GIMP, and sure enough, that shade of black is actually (7,3,31). The tiles are not meant to be visually attractive... but those preserved RGB codes are beautiful to me.


Client-Side Pixel Detection and Swapping


That's the topic of my next posting: now that I have RGB tiles, how can I intercept them, compare them against a set of RGB codes, and appropriately color/clear pixels...? Turns out it was easier and more elegant than I had imagined.

Saturday, May 14, 2016

In-Browser filtering of raster pixels in Leaflet, Part 1

Last week I was asked to do something I haven't done before:

A client has a raster of values, this being World Plant Hardiness Zones. Each pixel is a code number, which ranges from 1 to 13. If you're into gardening, you've seen the US variation on this on the back of seed packets.

The client wants the user to select one or more hardiness zones, and have the appropriate pixels highlighted on the map. For reasons I'll cover in a later blog post, this needs to be done using the raster's RGB codes and not through some of the simpler techniques such as GeoJSON polygons with varying opacity.

I ended up writing this Leaflet TileLayer extension. It takes a list of RGB codes, and then rewrites the tile images in-canvas so as to have only two colors: those which matched and those which did not.
http://greeninfo-network.github.io/L.TileLayer.PixelFilter/
https://github.com/GreenInfo-Network/L.TileLayer.PixelFilter/

And there's our basic need filled: near-instant pixel-filtering into "selected" and "non-selected" categories, with appropriate color fill.

If you're in a hurry, there are the links to the demo and to the library. But the technical process involved, and its application for the client, turned out to be very interesting indeed, so they will form my next few postings.

Friday, April 29, 2016

Heroku and Wordpress: Afterthoughts

The last two postings were about my successful and happy migration of my personal Wordpress site from my home PC to Heroku. It went beautifully, but a little testing and a few days of use revealed some needed improvements. One could call these Heroku pitfalls and gotchas: lessons for my next app.



Database Usage / Wordpress Revisions


My MySQL dump file was 2.3 MB, fitting quite well under the 5 MB mark. Wrong. After being loaded, it bloated to about 4.1 MB and I got a warning email from Heroku that I was nearing my capacity.

First off, thank you Heroku and ClearDB. Nice job on the notification before this became a problem.

Second, what's using my disk space? I did some queries (thank you, Stackoverflow) and found that wp_posts had 450 rows... for my 35 pages. Ah yes: Page Revisions. Wordpress keeps copies of every time I've clicked Update, so there are hundreds of extra pages in there.

I trawled the net a little bit for two pieces of advice that set this right:
  • Disabling revisions (or limiting them to 1 or 2 revisions, if you prefer), with a wp-config.php modification. Easy.
  • A plugin to clean out old revisions. I then uninstalled it since disabling revisions means there won't be more to clean up.
Now my table was down to 150 posts, which is right for 30ish pages and 120ish images. My dump file was under half the previous size.

To finish the job, I had to take a dump of the table and then reload it, so as to drop and recreate the table. (Databases do that: they keep freed space for future use. It's normally a good idea, but not here and today.) And now I'm down to 1.6 MB, one third of my free tier.

Sending Email


Heroku doesn't have a sendmail setup, and I'm not even sure they have an SMTP service. So Wordpress's email capabilities were not working.

Your basic options here are:
  • Sign up for SendGrid, since Heroku has an addon for that. They have a free tier of 12,000 messages per month, which is really more than my blog could ever generate.
  • Supply your own SMTP server.

I went for the latter, since I use GMail and can definitely navigate the steps of setting up a Google API project, generating the API keys, and configuring the GMail SMTP plugin for Wordpress. The plugin's own instructions were somewhat outdated, but that wasn't a problem. And voila, Wordpress sending email.

For your own programming, you would want to use PHPMailer or some such so you can specify your SMTP settings. A lot of things presume that there's a local sendmail or local SMTP, but not here.



So yeah, that was it. Not too awful in the "gotchas" department. Not at all! I am suitably impressed with Heroku at virtually every level: price, performance, ease of deployment once you wrap your head around it, and notifications.


Wednesday, April 27, 2016

Heroku and Wordpress: The Migration Process

My post day-before-yesterday began the story of my migration from a home-hosted website to Heroku. Here's part two: The Migration Process.

Step: Move my Media Library content into S3


First off, the uploaded photos should not be under version control, so they shouldn't be in the repository; and the ephemeral filesystem means they can't live on the dyno's disk either.

This was a tedious process of a few hours since I had about 130 images.
  • I set up the S3 plugin I mentioned in my previous post, made a few tests, and confirmed that it's sweet.
  • I SFTP'd into my Wordpress site's wp-content/uploads folder and grabbed everything.
  • Then deleted everything from my Media Library,
  • Then uploaded it all again and watched it load into S3 and leave my uploads folder empty.
  • Then went through every posting and replaced all of the images, which of course were now broken. Annoying, but fortunately I only had 35 pages with images and did it in about 2 hours.

Step: Init repo


I'm a fan of Gitlab. They offer unlimited private repositories for free, which is really excellent. I created the repository, then followed their simple instructions to load my Wordpress files into the repo and basically turn my site into a clone.

I also created a .gitignore file with these two entries, for some folders I'll be creating in a little bit. The private folder is where I'll do some other work I don't want in version control, e.g. working files and database dumps that I want to keep close. The vendor folder will be generated by composer later on; trust me.
/private/
/vendor/

Step: Add Heroku as a Secondary Master


The trick to allowing git push to deploy to Heroku is that you tell your repo clone to use your Heroku app as a secondary master.

heroku git:remote -a your-server-name

As of now, when pushing you will need to distinguish between git push origin master and git push heroku master. One pushes into your git repository (Gitlab, Github) and the other would redeploy to Heroku.

Step: Wordpress Updates


I then noticed that some updates were available for some plugins and for Wordpress itself. So I ran those, and after each one noted that git status reported exactly what I would expect from each upgrade. So three commits later I had Wordpress and plugins all updated, with commit notes for each update. (I could have done this before the repo init, but why not do it under version control?)

Step: Add a Procfile and composer.json file


For Heroku compatibility, it's advisable to add a Procfile and composer.json and composer.lock files, to indicate to Heroku what PHP version to prefer, that you prefer Apache over Nginx, and so on. You will want these in version control.

Procfile
web: vendor/bin/heroku-php-apache2

composer.json
{
  "require" : {
    "php": "^5.6.0"
  },
  "require-dev": {
    "heroku/heroku-buildpack-php": "*"
  }
}


composer.lock is generated from the composer.json with a command:
composer update --ignore-platform-reqs

Step: Push to Heroku


All set? Then here goes:
git push heroku master
And I visit my website. And it's a Wordpress error that it can't make the database connection. That's to be expected: my wp-config.php has the old credentials and I still need to upload the database content. But that's definitely Wordpress making the error, so a fine start.

Step: Database Configuration


First step was to scrub my database credentials from the wp-config.php file. Yeah, it was a dummy move to forget to do that earlier, but the credentials are wrong for Heroku so they're useless, and my local MySQL is going away in an hour anyway. But yes... don't do what I did. ;)

When I scrubbed the database credentials, I replaced them with $_ENV variables like this:
define('DB_NAME', $_ENV['DATABASE_BASE']);
define('DB_USER', $_ENV['DATABASE_USER']);
define('DB_PASSWORD', $_ENV['DATABASE_PASS']);
define('DB_HOST', $_ENV['DATABASE_HOST']);
Add, commit, push. Site is still dead but it's ready for this next trick.


Run heroku config and it dumps the app's configuration variables back to me. I tease apart the database URL string into my set of 4 environment variables, for the 4 $_ENV items above.
heroku config:set DATABASE_BASE='heroku_XXX' DATABASE_USER='XXX' DATABASE_PASS='XXX' DATABASE_HOST='XXX'
My app/dyno reboots and... the site's up!

Step: Database Data


A simple mysqldump was all it took to take a backup of my database, then loading it was one more command.

mysqldump -h localhost -u olduser -p olddbname > private/dbdump.sql
mysql -h herokuhost -u herokuuser -p herokudbname < private/dbdump.sql

It doesn't get a lot easier than this.

And now the site is up! My data, on a new database on a new Heroku server. Shiny.


Step: Custom Domain


My site is www.fightingfantasyfan.info and not fightingfantasyfan.herokuapp.com. Heroku calls this a custom domain. There are two steps in setting up the Heroku app to work properly with a custom domain.

  • I hit up Heroku's dashboard and the Settings for my site, and added two domains for it: www.fightingfantasyfan.info and fightingfantasyfan.info. This allows it to respond to these alternate hostnames, should they point to this app.
  • I went to my domain registrar's control panel and set up a CNAME record, so that www.fightingfantasyfan.info is equivalent to fightingfantasyfan.herokuapp.com. This took a little bit to propagate, but was done about the time I finished my cup of coffee.


And that really was it. Well, almost. More on this tomorrow.


Monday, April 25, 2016

Heroku and Wordpress

This is a bit off-topic for The Map Guy since it has nothing to do with maps, directly. But it's interesting and it's about the cloud, and about software deployment. It's about... Hosting a low-volume personal Wordpress website on Heroku for free.

Introduction


I have a couple of personal websites for hobbies. I'm almost ashamed to admit that I'm using Wordpress for them, being the hardcore, bad-ass hacker that I am... but in the evenings and with nothing at stake, the convenience of Wordpress works for me.

I had been hosting my site on a Raspberry Pi at home, but the reliability just wasn't there. The Pi would crash from time to time and require rebooting. After the second time it did this while I was away on a trip, I decided it was time for a change. Even better, this is an opportunity to say Hello To The Cloud with a real-world application with no stakes involved.

Enter Heroku, my hero.

Heroku is a service that spins up virtual machines (they call them "apps"), using your git repository as their filesystem. The basic idea is this:
  • You have a git repository which contains your website: HTML, PHP, Python, Ruby, etc.
  • You create an "app" via their dashboard, also called a "dyno", and it has a URL. You then attach "Add-Ons" such as a MySQL database, a PostgreSQL database, SMTP via SendGrid, and other such services that you'll need.
  • Using Heroku's command-line "toolbelt", you record in your clone of the repository that it is "connected" to that app/VPS. So your clone has two masters: git push origin master for saving your code to version control as usual, and git push heroku master to deploy it to the site.
  • When you push to heroku master, Heroku resets your VPS, loading up its filesystem from your repository content. Assuming that your code works, the site works too.
  • No permanent storage ("ephemeral filesystem"). When the VPS is load-balanced to another system, or put to rest, or you push again, it's reset from your repo. This is very important if your web app will be making filesystem modifications such as accepting uploads. More on this later.
So the effect is a build-from-components server, enabling the specific services you need.

For small personal sites, free tiers for the dyno and for the add-on services may prove sufficient to host the Wordpress code and the database entirely for free. The limitations of this free tier, as relevant to my personal Wordpress site, are:
  • The dyno (the VPS itself) will go to sleep if there's no activity, and the next incoming request to wake it up does mean some delay. The dyno must be asleep at least 6 hours per day, so using Uptime Robot to keep it awake is a no-no. The next step up, which gets rid of the sleeping, is only $10 per month.
  • The ClearDB MySQL database is free only to 5 MB. This may not be enough for a long-running daily blog, but for a few postings per month maybe it's just what you need. The next step up is only $10 per month for 1 GB.

Since Heroku does support PHP, and a mysqldump of my database is only 2 MB in size, this sounds right on target for a Heroku freebie.


Ephemeral Filesystem


Let me reiterate that "permanent storage" comment above. Your Heroku filesystem is a git repository. Your dyno does have the ability to write files on disk, e.g. an uploads folder such as the Media Library component of Wordpress. But you won't like it. When your dyno resets, the filesystem is reinitialized and your modifications are lost.

By "reset" I mean a reboot, being load-balanced onto a new server, going to sleep and waking back up, your next git push, or using heroku config:set to change environment variables. Bye bye, uploaded files.

The up side of this is that a hack of your website, assuming it didn't damage the database, can be solved by rebooting. The down side is that you need to find someplace else for long-term storage of any web-supplied files not in your repository (or else make a practice of manually downloading the files and adding them to the repo, which sounds awful).

In the case of Wordpress the easiest solution I found is the WP Offload S3 Lite plugin. When you upload to the Media Library, it goes to Amazon S3 instead, and media URLs are rewritten to point to their S3 version. Between Amazon's generous 5 GB free tier for a year and the real price being a paltry 3 pence per gigabyte per month, even an image-heavy website can get by on pocket change per month.

If you are writing your own software, you'd want to code for your cloud storage of choice such as Amazon S3, the Google Drive API, the Dropbox API, etc., where you supply a file and get back a URL. I imagine you'd need to generate API keys, program the OAuth-style exchange of them, handle errors, etc., and that sounds awful. For my own case here, though, the folks at Delicious Brains had generously done that heavy lifting for me.


Happy Ending


Spoiler time: my applications are all running on Heroku and the performance is quite acceptable. And I learned a lot in the process. The rest of this story is how I got to this happy place.

So, now that I've covered the basics, tomorrow's story will be about the migration process!

Monday, February 15, 2016

Some Leaflet controls for modal dialogs

A recurring need we have is to open a dialog or modal from within a Leaflet map. Dialogs and modals are a great way to add Feedback links, to show the legend without taking up screen space, to add a Help or About panel, and so on.

A perfectly ordinary way would be a positioned DIV anchored to the edge of the screen or to the edge of the map DIV, so it looks like a "tab"; you've surely seen this before for Feedback links along the edge of the screen. But this time we wanted a button in-map, something that looked like a real Leaflet button. It wasn't tough to do, because Leaflet rocks.
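The in-map button technique boils down to a custom Leaflet control. Here's a minimal sketch of the idea, not the actual API of either plugin below: the factory name and the modal-opening callback are illustrative, and the factory takes the Leaflet namespace as a parameter just to make the wiring explicit.

```javascript
// Minimal sketch of an in-map button control that opens a modal.
// `leaflet` is the Leaflet namespace (normally the global L), and
// `openModal` is whatever shows your dialog, e.g. a Bootstrap or
// jQuery UI modal call. Names here are illustrative assumptions.
function makeModalButtonControl(leaflet, openModal) {
    return leaflet.Control.extend({
        options: { position: 'topright' },
        onAdd: function (map) {
            // Reuse Leaflet's own "bar" classes so the button looks
            // like a native zoom-style control.
            var container = leaflet.DomUtil.create('div', 'leaflet-bar leaflet-control');
            var link = leaflet.DomUtil.create('a', 'leaflet-control-modal-button', container);
            link.href = '#';
            link.innerHTML = '?';
            leaflet.DomEvent.on(link, 'click', function (ev) {
                leaflet.DomEvent.preventDefault(ev);
                openModal(); // e.g. $('#about-modal').modal('show')
            });
            return container;
        }
    });
}
// In a real page: map.addControl(new (makeModalButtonControl(L, opener))());
```

From there, "polishing it up" is mostly a matter of exposing the position, icon, and modal selector as options.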

Now... having done it, why not polish it up so that next time it's copy-paste simple? Here we go:

https://github.com/gregallensworth/L.Control.BootstrapModal
https://github.com/gregallensworth/L.Control.jQueryDialog

I even took the step of making a jQuery UI version. The project last week didn't use this, but it was a small step to adapt the button code, and it will surely serve us in the future.

Enjoy!

Friday, January 8, 2016

IonicMapStarter, the Ionic successor to MobileMapStarter

At winter solstice 2012, I took a few days of vacation time and used it to create MobileMapStarter. The intent was to boil down the essential and generally-reusable bits of a mobile mapping app, strip out the application-specific stuff, and have a starting place for our future map apps. This meant:
  • A working mobile framework, with page-view management and all
  • Working around certain bugs and issues of Leaflet when used inside page-view systems
  • Working boilerplate code for geocoding, location tracking, etc.
  • Designing it with configuration separate from execution, so it's simple to reconfigure
  • Ability to cache tiles for offline use
And it worked out famously. I presented MobileMapStarter at FOSS4G 2014 (welcome to Portland!) and was surprised at how popular it became.

But, jQuery Mobile has problems and I have been looking for a newer framework to replace it. Some weeks back I really got to like Ionic, and winter solstice 2015 has brought you...

IonicMapStarter


It's the same concept as before: a mobile app scaffold that's easy to reconfigure and adapt into your own mobile app. But this time it's built with Ionic. I mentioned a few weeks back some improvements which this change brought to ParkInfo Mobile:
  • Better performance all around, from the map to general page-loading and panning behavior
  • Cleaner structure from the ground up, and a reduction in code volume
  • Improved UI for the offline tile caching, as well as improved capabilities
We still aren't releasing ParkInfo Mobile until we finish some branding and functional tweaks, but you can start on your next-generation mobile app tonight.


Monday, January 4, 2016

JSHint + AngularJS = Tight as a drum

So, in the development of the new version of ParkInfo Mobile, I wanted to run things through some code-quality checks as a matter of course. I'm good at what I do, but having a second pair of eyes (cybereyes!) look for missing semicolons etc. sure won't hurt.

I went with JSHint and I really like the results.
  • JSHint noticed a few stragglers such as semicolons, trivial stuff that could lead to larger goofs. Making use of JavaScript statement chains that cross lines means that a stray or missing semicolon can cause truly bizarre malfunctions. For this reason I prefer not to use the multi-line syntax, but with Angular it does happen a lot, so this extra check is nice.
  • JSHint also complains about undeclared variables, e.g. L, which is declared in leaflet.js, and angular, which is declared in angular.js. These weren't errors at all and are easily and permanently silenced... and the check could have been invaluable if the "undeclared global" were actually a typo.
  • JSHint also reports unused variables. In most cases this was a callback that receives an error object, and we don't use the error object. But in a few cases it was dependency injections which were no longer in use. So a second use of JSHint was to check for unused dependencies and for erroneous undeclared dependencies. Very nice.
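Those undeclared-global warnings can be silenced permanently with a .jshintrc file in the project root. The option names below are real JSHint options; the globals listed are just the ones from this project:

```json
{
  "undef": true,
  "unused": true,
  "globals": {
    "L": false,
    "angular": false
  }
}
```

Setting a global to false tells JSHint it is read-only, so an accidental reassignment of L or angular still gets flagged.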
So, I threw together a quick-n-dirty shell script to run everything through JSHint, and I run it when I get ready to push.

Step 1, install JSHint via npm. You're used to this if you use Cordova:

npm install -g jshint
Step 2, set up this script. Heck, add it to your source archive:
#!/bin/sh

for js in www/index.js www/controllers/*.js ; do
    echo "********** CHECKING $js"
    echo ""

    jshint $js

    echo ""
done
It's dead simple, nothing special... but it does keep things a bit tighter, and makes for easy and automated checks for some common goofs.

Now, the other tool commonly used here is JSLint. But so far I'm not impressed with it. Reporting unused variables and potential typographical errors is great, but JSLint also reports truly trivial stuff, such as having a space after the colon in an object literal: stuff which has no impact nor any potential for impact. Maybe later, but not today.