Getting started with 3D printing for a developer

Over the past 20 years, I have on numerous occasions grabbed the latest demo copy (for some definition of “demo”) of AutoCAD or SolidWorks and sat down with it, determined to learn how to make things. For many years I have been aware of the various CAM packages designed to turn 3D models into tool paths for CNC machines, and more recently of slicing software for turning 3D models into print jobs. It seemed to me that a person could go buy a CNC mill or lathe and the requisite accessories, buy CAM software, and start making parts. Sure, you would probably send quite a bit of raw material and tooling to the recycler along the way in a teach-yourself-machining scenario, but you could start building stuff — especially simple things — almost immediately. Except… well, you needed a 3D model at the start of the whole process. The more you know, the more you realize you don’t know. Obviously the “if I have a 3D model I can run!” concept is a gross oversimplification of every step, and disregards what one could spend an entire lifetime becoming really good at. When I first started thinking along these lines I was a teenager, optimistic and all-knowing.

AutoCAD 2004 For Dummies

At some point, this gem made it into my collection and has faithfully gathered dust next to a slew of programming reference manuals. The “For Dummies” book I wasted $24.99 on at a Borders in Rapid City, SD as a 21-year-old is offered as evidence that when I say I have been wanting to CAD for 20 years — I’m not completely full of shit. Just mostly. There’s a point here: it DOES start with a model. You can prototype one-off things by hand all day, but if you want the benefit of computer-aided manufacture, whether you are adding material or subtracting it, you need to be able to model the resulting part. Your model may look a bit different depending on the manufacturing method, but it sure as heck won’t look like anything at all if you can’t build it. So this is the hard part, the mental gap, the crux.

It’s a whole new world now. When I buy reference manuals I find I don’t actually have time to read them cover to cover and absorb the knowledge; I just use them for reference. I’m busy doing stuff, paying bills, being an adult. When did that happen? I know myself well enough to know that if I want to learn something new and actually fit it in, buying a book doesn’t cut it; I need a practical project to apply it to.

I have a few of these Milwaukee batteries around, and they are pretty good stuff. 54 Watt-hours in this specimen, 72 Watt-hours in another; that’s enough stored lithium-ion fairy magic to do something serious. By comparison, my previous generation DeWalt power tool batteries were only in the 20-30 Watt-hour (Wh) ballpark.

There are lots of applications where I could use a good battery: bike lights, powering small DC motors, recharging my cell phone after too many hours watching YouTube videos, jump starting my motorcycle(s) every spring because I forgot to plug them into the battery maintainers… Lots! Milwaukee is obviously hip to this concept, as they sell a “Power Port” with a built-in converter that snaps onto one of these M18 batteries and supplies a 5V USB port. There are a damn lot of cell phone charges in a 72Wh battery. Typical high-capacity smartphone batteries are in the 10Wh ballpark, so even before conversion losses that is roughly seven full charges. See also: Why Watt-hours

A few years ago I did a fair bit of work for a Rapid City company named B9Creations. They sell a 3D printer that is very popular among jewelers. As a result of that work, I learned enough about the competing 3D printing technologies to make what I felt would be an informed purchase decision: the ability to realistically weigh initial purchase price, consumable price, time spent on maintenance and configuration, print speed, build volume, etc. B9 sells a great product, and if you can reasonably incorporate 3D printing into your job then you should probably consider buying one. SLA is rad. In my case, any 3D printer is going to sit next to my $49 InkJet (OK, OK, $299 LaserJet) and very likely not get used at all once the novelty wears off. How much is the novelty worth? Not. Very. Much. That LaserJet is a bit of a novelty, and the $49 InkJet would have been OK — so I guess if we do the math, the novelty is worth less than $300. My standards weren’t exactly that low, though. 100 micron build resolution? Yeah, that seems like a MINIMUM standard. 100mm/sec build speed? Again, minimal. The idea of it taking a full 24+ hours to build large/complex parts makes me roll my eyes and lose interest in a hurry.

Six months ago or so I stumbled across the Monoprice line of 3D printers. If you are unfamiliar with Monoprice, their business model is interesting and seems to fill a niche. They buy a bunch of something in a common specification, sell it at a very good price, and provide a reasonable level of service/support. So 1 option, not 37. No fancy packaging. Not straight from overseas so I can easily return it if I need to. This works for me and is a niche Amazon is also seeking to fill with products branded Amazon Basics.

For the last six months or so, I have had my eye on their V2 Mini printer, which comes in around the $200 mark and has favorable reviews. My biggest concern was the build volume; it’s also pretty slow. But, but… $200! Fast forward six months, and I noticed the “Maker Select Plus” model, which retails for $399, was available as a refurbished unit for a reasonable ($50) discount. I have always had good luck with refurbished equipment and am not afraid to give it a go so long as there is a reasonable window in which I am able to return it. After deciding I could probably, maybe, possibly… justify this… I noticed they had an “open box” version of the same model, ostensibly a unit that had been returned but had not gone through the refurbishment process, for $299. This printer and many other popular low-cost printers are based on an open source design. All of the parts are exposed, replacement parts are easy to source, and any problems shouldn’t be rocket science to diagnose. Bottom line: I can fix it or return it if it comes out of the box broken. OK — fine. Printer ordered for $299.99, $0.01 below my novelty purchase threshold.

There is not TOO much to say about the unboxing and assembly (you can find that review somewhere else, I am sure), but I will detail what was notable. The printer I received had obviously run some material through the nozzle, but the build platform did not look used. Overall it was well packaged and easily assembled. My problems began almost immediately after assembly, though. In total, I ended up with several hours of debugging, which is no big deal, and I am still happy with my purchase. It’s hard to put the issues I ran into in context: a person familiar with 3D printers or other multi-axis CNC equipment might have diagnosed and fixed them in ten minutes, while most people probably would have returned the printer out of frustration.

The printer auto-homes at startup to sort of figure out what’s what and where everything is. I thought nothing of this auto-homing procedure at first, as I assumed any limit values for the three axis’s, axes, plural axe, would be hard coded in firmware somewhere, or something. This was ostensibly confirmed by a setting for the start positions of X, Y, and Z accessible via the touch-screen menu. After startup, it was nearly impossible to calibrate the build platform to get it perfectly level. The machine was also slightly bent from the prior owner trying to get this calibration done — fortunately not bent in a way that compromises the machine, it just gives it character aesthetically. After dinking around for a couple of hours I got it mostly sorted out, or so I thought. Then I noticed the quality of the first few layers of my part seemed to vary if the machine had been turned off and back on in between prints. The only way that would make a difference is if the auto-home process was affecting something.

One thing I found weird about the machine from the start was that the auto-home for the Z axis would touch down at a spot off the build platform: it was touching down on the bolt head used to secure the heated platform, not on the print surface. I did not know if this was right or wrong, it just seemed odd. As it turns out, it was wrong. Depending on where it touched down on this rounded bolt head, it would throw off the first layers, or not. Dead in the center of the bolt head and my adjustment held; just slightly off center and everything was a mess. It all suddenly made sense. The build platform was moving too far during the auto-home process, and the machine was setting up configuration values based on where it stopped. I adjusted the Y axis limit switch so the build table would stop slightly further toward the front of the printer, which resulted in the extruder always touching down on the actual build surface during auto-home. Boom. Easy to calibrate. Calibration held every time. Off to the races. I haven’t verified this is how the printer actually works, for what it’s worth; I’m mostly just pulling all of this out of my rear. Tell me I’m full of it, I’ll probably believe you.

So I’m printing test prints downloaded from the internet. That’s cool, except there are a few million models I can download from all manner of different websites and most of them are utter rubbish: toys, novelties, trinkets. The only ready-to-print models I have found so far that I expect to be useful are replacement parts for the printer… (Yeah, OK, so there’s a Milwaukee battery mount model out there already, but that’s beside the point.)

On to the project at hand! Using these Milwaukee batteries for stuff other than making power tools spin! Shit. I’m back to needing a 3D model.

I spent a few days playing with various pieces of software, some free, some not. Fusion 360 seems pretty capable for the price. I can see why TinkerCAD is popular; it is in fact quite easy to pull off basic things. I watched a bunch of videos. Lots. So many videos. I tried quite a few different things and started to get the hang of it. Started being the key word. I sort of got my brain wrapped around the concept that I could draw things, and then apply constraints to different pieces to make my imprecise drawings precise. Still, it all felt quite imprecise and clumsy. At this point, I knew one of the primitive operations all of these CAD platforms can perform is “extruding” a 2D shape into a 3D object. I couldn’t help but think how much easier it would be to programmatically generate a vector path, then import it into the CAD package and extrude it into a 3D shape. After years of tackling problems with algorithms, it was more natural to think about solving things this way. In programming we are often conceptually thinking in three dimensions, oftentimes about multiple three-dimensional data structures and how they interact with one another. Conceptually, this 3D modeling thing was easy. There I sat, though, in a GUI-driven CAD package, able to model in my mind exactly what I wanted to do and how I should be able to go about it, and unable to express that through the GUI. Frustrated. Tired. The novelty was wearing off quickly.

I didn’t want to give up, not this time, so I started looking at how I could script CAD tools to do some of what I wanted programmatically while I continued to learn the interface. Yep, in retrospect I was basically a rat in the bottom of a flushing toilet gasping for air: I had a lawn to mow and a room to paint and laundry to do and bills to pay; there was no time for this point-and-clickery.

Then I stumbled on Blender and its ability to be scripted with Python, on OpenSCAD, and, most interesting to me, on OpenJSCAD. I almost didn’t pursue learning any of these tools. It seemed like GUI-based modeling was what everyone used, and perhaps these tools were just fringe software stacks of questionable quality. I watched more videos, I spent more time pointing and clicking. It still wasn’t clicking, though. Not having printed anything of my own creation yet, I finally broke down and sat down with an OpenJSCAD instance in one tab, the documentation in another, and a Milwaukee battery and digital caliper in my lap. I started taking measurements. I started creating shapes. I started merging shapes and subtracting them from one another. I made a whole bunch of mistakes, the code was a mess, I put the battery clips in the wrong place, I used the wrong units. I had a model, though.
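To give a flavor of what this looks like in practice, here is a minimal OpenJSCAD-style sketch in the spirit of what I was doing. The dimensions are illustrative placeholders, not my actual battery measurements:

// Minimal OpenJSCAD (v1 syntax) sketch: a solid block with a
// battery-sized pocket subtracted out of it. Dimensions (mm) are placeholders.
function main() {
    var body = cube({size: [90, 55, 30], center: true});    // outer block
    var pocket = cube({size: [80, 47, 26], center: true})   // cavity the battery slides into
        .translate([0, 0, 4]);                               // shift up to leave a solid floor
    return difference(body, pocket);                         // subtract the cavity from the block
}

Paste something like that into the OpenJSCAD editor, render it, measure, tweak a number, render again. That tight loop is what finally worked for my brain.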

I slept on it, made some tweaks, and printed it to validate whether I could even slide a battery in and out; I was completely ignoring the electrical connections at this point. It printed, and my battery didn’t fit. It was easy to see why not, though, so I did another iteration. Printed; the battery still didn’t fit — but darn close! Iteration three, after learning about hull operations and how to radius things and… the model looks a lot better AND THE DAMN BATTERY FITS. I have printed 7 of these now, each with various tweaks that improved on the prior.

The most recent iteration (not shown) has electrical connectors and is fully functional. I have been quite impressed by the structural rigidity, strength, and light weight of the finished printed parts. You aren’t going to build passenger planes or rockets that can go to space with these parts, but for coming out of what is essentially a really precise hot glue gun they are amazing. Print #4 weighs 29 grams. It was printed with Monoprice PLA+ filament that was $19.99 for a 1kg spool. Material cost for this print? $0.58. Print time? About two hours. Would I trust it to hold onto my 1.5lb battery while it was repeatedly subjected to 3-4g’s? Yeah. Probably. It’s a functional part.

It has now been 1 month since I ordered the printer. OpenSCAD and OpenJSCAD have proven to be very rich environments for creating models. I find I am only limited by my ability to adequately express myself in code. This is the same limiting factor (mental leap) I found in GUI-based tools as well, but with the programmatic interface, I am able to bring my experience to bear on the problem. Sometimes my solutions lack elegance, but I can make it do what I want!

Developer? Want to model parts in 3D? Don’t do the clicky. Do what you do.

Tracking MachForm form submissions with Google Analytics

MachForm (self-hosted) is a great tool for managing many different types of user submissions from visitors to your website. While WordPress has a great form option in Gravity Forms, MachForm is platform agnostic and has a number of integration options allowing it to coexist well with almost any LAMP-based web deployment.

Since version 4, MachForm has allowed loading a custom JavaScript file, configurable on a per-form basis. This provides an excellent facility for tracking form submissions in Google Analytics. These events can then be used to create goals, etc.

This is perhaps easier than it sounds. The first step is adding the Google Analytics embed code for the website to a file (assuming you are using the default iframe embed mode of MachForm), without the line that tracks a pageview. Since MachForm uses jQuery internally, we can use jQuery here to attach a handler to the form that sends our Google Analytics event when the form is submitted. The portion of the code that extracts the title of the form may differ depending on the MachForm version, the MachForm theme chosen, etc.

// Standard Google Analytics (analytics.js) loader snippet -- note there is
// no ga('send', 'pageview') call here, we only want the submit event.
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
        (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

// Replace with your own property ID.
ga('create', 'UA-XXXXXXXXX-1', 'auto');

$(document).ready(function(){
    // MachForm renders its submit button with the id "submit_form".
    $('#submit_form').click(function(e){
        // Pull the form title out of the rendered markup and use it as the event label.
        var title = $('#form_container > h1 > a').html();
        ga('send', 'event', 'form', 'submit', title);

        // Hold the actual form submission briefly so the event has a chance
        // to reach Google Analytics before the page unloads.
        var form = this.closest('form');
        e.preventDefault();
        setTimeout(function(){
            form.submit();
        }, 500);
    });
});

Once this JavaScript is saved to a file and uploaded to your server, add the path under Advanced Options for each form you wish to track and you are off to the races.
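One refinement worth considering: instead of a fixed 500ms delay, analytics.js supports a hitCallback field that fires once the event hit has been sent, so the form can be submitted right away. A sketch of the idea, reusing the form and title variables from the click handler above and keeping a timeout as a fallback in case analytics.js is blocked or never loads:

var submitted = false;
var doSubmit = function(){
    if (!submitted) { submitted = true; form.submit(); }
};
ga('send', 'event', 'form', 'submit', title, { hitCallback: doSubmit });
setTimeout(doSubmit, 1000);   // fallback if the hit never completes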

Magento Integer based SQL injection vulnerability in product parameter

Recently I was asked to look into a potential PCI compliance issue in Magento 1.7/1.8/1.9. The potential issue was uncovered by ControlScan. The summary was as follows:

Integer based SQL injection vulnerability in product parameter to /checkout/cart/add/uenc/<snip>,/product/<id>/
 
Risk: High (3)
Port: 80/tcp
Protocol: tcp
Threat ID: web_prog_sql_integer

Upon diving into the additional information supplied, it was almost immediately clear what the test was doing. It was performing a POST request against the URL: /checkout/cart/add/uenc/<snip>,/product/XYZ/
Here XYZ is a valid Magento product id. In the payload (POST’d multipart/form-data) that would get parsed into the PHP $_POST superglobal, an initial request passed product=XYZ, and a subsequent request passed product=XYZ-2.

The scan saw the same output returned for each request, and thus assumed the cart might be getting “duped” by the invalid XYZ-2.

Let’s take a look at the code that handles this submission (an AJAX-style action that adds a product to the cart). It is located in app/code/core/Mage/Checkout/controllers/CartController.php, starting around line 170, in the addAction public method. The take-away here is that the $params variable set up in addAction, as well as the product id discovery in _initProduct, both retrieve their data by calling $this->getRequest()->getParams() — this parameter data can come from any number of places, including the URL path, GET, or POST. In this instance, the product id is being parsed out of the URL, and the product supplied via POST is never referenced. No wonder the output was the same: the URL was the same in both cases, and the modified POST data was never a factor.

If you simply want to tighten up your cart to get it to pass your PCI compliance scan, the following code will do that for you. Just replace the top part of addAction with the following, and be prepared to undo this patch at the next upgrade.

public function addAction()
{
    $cart   = $this->_getCart();
    $params = $this->getRequest()->getParams();

    // Read the raw request body and parse its final line as a URL-encoded
    // string so we can see the product id that was actually POSTed.
    $postInput = file_get_contents("php://input");
    $postStrDataArr = explode("\n", $postInput);
    $postStrData = array_pop($postStrDataArr);
    parse_str($postStrData, $postData);

    // Reject the request if the POSTed product id disagrees with the one
    // parsed from the URL, or if the URL product id is not numeric.
    if ((isset($postData['product']) && $postData['product'] != $params['product']) || !is_numeric($params['product'])) {
        throw new Exception('Invalid Product ID');
    }

    try {

This modification compares the parameter parsed via the URL with the parameter passed via POST and throws an Exception if the two do not match.

No doubt there is a better and more Magento-esque way to remedy this issue, but the above will work in a pinch.

Real world effects of changing rel canonical link element

In 2009, Google introduced a method website owners could use to disambiguate duplicate content. By specifying a rel=canonical link element in the head of the page, you give the search engine a hint as to which URL should be authoritative for the given content. It should be noted Google has indicated they consider this method a hint, and not a directive. The conditions under which the hint will be ignored are not known, but such conditions are presumed to exist.
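For reference, the element itself is a single line in the page’s head, pointing at whichever URL you want treated as authoritative (the URL here is just an example):

<link rel="canonical" href="https://www.example-agency.com/listings/123-main-street/" />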

Imagine a simple example: anyone who has purchased a home or property in the US is reasonably familiar with the Multiple Listing Service (MLS). Real estate agents add properties to the MLS, and the exact same information shows up on the website of every agent and agency. How does Google know which website(s) are authoritative for this information if it is the same on potentially thousands of websites? This is a contrived example of a real-world problem, and implementing a strategy around canonical link elements can help ensure people end up where you want them to be. One strategy might be to send visitors to the website of the agency, rather than to the individual agents.

That information is all well and good, in theory, but how does it actually work in practice?

A tale of two websites…

Recently there was a case where a series of several dozen guest blogs on an established website needed to be moved, removed, or somehow re-incorporated into the overall strategy. The established site and its mission had grown and changed; meanwhile, the blog series in question had grown less relevant to the overall goals of the site. But it was still good content that many people accessed and used as a reference!

It was decided the content wasn’t “hurting” anything and could remain, but it would be inaccessible via primary navigation routes and should, over the long term, be given a new home. The original author of the blogs was willing to give the content a new, permanent home on his own personal site. The author’s site did not yet exist, had no existing inbound links, and had zero authority with search engines — a blank slate!

Each blog post in question was re-posted on this new website, several dozen posts in total, a handful of which receive a reasonable amount of search engine traffic. The canonical links for the articles on the established site were then changed to reference these new pages on the formerly empty domain.

Google quickly adapted to the new “home address” of these pages, and within a matter of days the new domain was seeing all of the search engine impressions for these articles. The pattern held over the following month.

In the following graphic, a screenshot from the Google Search Console, you can clearly see the number of search impressions served by Google quickly ramped from zero to roughly 50 per day.

[Screenshot: Google Search Console impressions for the new domain]

Here you can see the same data, over a slightly longer period, from the established site. The “new” site neatly stripped away around 10% of the organic search engine traffic from the established site.

[Screenshot: Google Search Console impressions for the established site]

Most scenarios involving duplicate content management with the rel=canonical link element aren’t going to exactly match this one, so please take these results with a grain of salt. That said, it does clearly show the cause, effect, and timing surrounding changing the canonical links for established pages. It also clearly shows that Google pays attention to these canonical elements and can take fairly swift action on them.

New Sturgis Area Resource: Visit Sturgis

A new resource for visitors to the Sturgis area recently launched: a website called Visit Sturgis. The Visit Sturgis website contains a wealth of information visitors would have a hard time uncovering in a short trip without a local guide. The new website will be useful not only as a planning tool for those visiting Sturgis as a pre-planned destination, but also for ad-hoc visits. Sturgis sits directly on the heavily trafficked Interstate 90 corridor and as a result sees many thousands of visitors every year who are simply passing through.

The resources and information on the Visit Sturgis website will cover the usual suspects, such as information about lodging, restaurants, and local businesses of interest. The website will also contain a great diversity of information about local events and little-known avenues for recreation.

Sturgis plays host to many events in addition to the well-known Sturgis Motorcycle Rally; these include the Sturgis Camaro Rally, Sturgis Mustang Rally, Tatanka Mountain Bike Race, Sturgis Gran Fondo, and many more.

There are also many miles of non-motorized single track trails accessible directly from town that cater to mountain bikers, hikers, trail runners, dog walkers, and horseback riders. See the recreation information on the Visit Sturgis website for more information about accessing these trails.

While this website may have been recently launched, it already contains valuable information that cannot be found elsewhere. Expect it to continue to grow into an ever more informative resource as time progresses.

New website

The new Black Hills Trails site is now live, although a bit sparse at the moment. Significant changes and additions are expected in the coming weeks and months.

The maps and mapping functionality of PahaSapaTrails.com has been given to the Black Hills Trails organization and incorporated into this new site. As the maps and mapping functionality are extended and enhanced, the PahaSapaTrails.com website will eventually be phased out altogether.

Generating aerial tiles from NAIP imagery

PahaSapaTrails.com is working on a mobile (iPhone/Android) trail mapping application for the Black Hills area and one of the chief considerations is making it work offline. As any local can tell you, cellular service can be spotty in the Hills, even in town in some cases! The plan is to let folks optionally download the map data/tiles directly to their device using the MBTiles format.

So you need to create your own aerial imagery tiles for a web slippy map project, do you? Before you dive too far down this rabbit hole, take a look at the MapQuest Open Aerial tiles; they are available for use under very liberal terms and are good quality. I was unable to use them for this project only because not all tiles were available at the zoom levels I needed for my area.

Getting and processing the data

Almost every year the NAIP (National Agriculture Imagery Program) captures imagery of most of the country during the growing season. This high quality aerial imagery of the United States (the same imagery used by Google and other web mapping providers) is available for free download from the USDA Geospatial Data Gateway. It may be available for order in other formats, but the only option available for download in my area is an ESRI Shapefile / MrSID format. MrSID is a (typically lossy) image format designed for very high resolution imagery. Unfortunately, I have not found many good, inexpensive tools for working with MrSID files, so the first step in this process is converting to a format with better software support, in this case GeoTIFF. The GeoExpress Command Line Utilities published by LizardTech, available for free download at the time of this writing, can do this extraction for us with the following command:

mrsidgeodecode -wf -i Crook_2012/ortho_1-1_1n_s_wy011_2012_1.sid -o Crook_2012.tiff

In this example I am using imagery for Crook County, Wyoming. The -wf (world format) option to mrsidgeodecode seems to be important; it tells it to create a geo-referenced TIFF file.

Now that we have our imagery in the GeoTIFF format we can use the open source GDAL/OGR command-line utilities to slice and dice the data. The commands used from here on out (nearblack, ogrinfo, gdalwarp, and gdaladdo) all ship with the GDAL/OGR libraries.

The next hurdle is that this raster imagery always has a border of not-quite-black pixels that need to be pared off somehow prior to being able to use multiple adjacent images (counties in my case). If your target tiles exist within one county (or one MrSID file as downloaded from the USDA gateway) then you probably do not need to worry about this.

nearblack -nb 5 -setalpha -of GTiff -o Crook_2012_NB.tiff Crook_2012.tiff

The -nb 5 option in effect tells nearblack how aggressive to be; this seemed to work for me, but your mileage may vary.

After trimming the edges we need to warp the GeoTIFF to our target projection. Basically all web mapping uses the same projection, EPSG:3857. In my case I am creating tiles with TileMill, and their documentation specifies that GeoTIFFs should be in this projection. The only trick here is that you must supply the source projection: the GeoTIFF contains coordinate information, but it lost its projection along the way. Use the ogrinfo utility to first get a list of layers available in the shapefile you downloaded from the USDA.

ogrinfo Crook_2012/ortho_1-1_1n_s_wy011_2012_1.shp
INFO: Open of `Crook_2012/ortho_1-1_1n_s_wy011_2012_1.shp'
      using driver `ESRI Shapefile' successful.
1: ortho_1-1_1n_s_wy011_2012_1 (Polygon)

Then, you will need to get the information about that layer to find the original projection.

ogrinfo Crook_2012/ortho_1-1_1n_s_wy011_2012_1.shp ortho_1-1_1n_s_wy011_2012_1
INFO: Open of `Crook_2012/ortho_1-1_1n_s_wy011_2012_1.shp'
      using driver `ESRI Shapefile' successful.

Layer name: ortho_1-1_1n_s_wy011_2012_1
Geometry: Polygon
Feature Count: 12
Extent: (489299.300000, 4885056.470000) - (580705.680000, 4990601.000000)
Layer SRS WKT:
PROJCS["NAD_1983_UTM_Zone_13N",
    GEOGCS["GCS_North_American_1983",
        DATUM["North_American_Datum_1983",
...

In this case it is “NAD_1983_UTM_Zone_13N”; you may have to Google around to find the corresponding EPSG number, which here is EPSG:26913. After all that, we can warp the GeoTIFF. The --config and -wm options here speed up gdalwarp by letting it use more RAM; you may want to play with these a bit to figure out what is fastest for you.

gdalwarp --config GDAL_CACHEMAX 300 -wm 300 -s_srs EPSG:26913 -t_srs EPSG:3857 -r bilinear -of GTiff -co TILED=yes Crook_2012_NB.tiff Crook_2012_NB_GoogleMercator.tiff

A person could at this point use GDAL to merge multiple GeoTIFFs together (if applicable) and then use the gdal2tiles script to generate tiles directly; a rough sketch of those commands follows. In my case, my workflow already involves creating tiles with TileMill, so I opted for that route.
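For anyone going the gdal2tiles route, the commands would look roughly like the following. I have not used this path myself; the second input file stands in for whatever adjacent county you are merging, and the zoom range is just an example.

gdal_merge.py -o merged.tiff Crook_2012_NB_GoogleMercator.tiff AdjacentCounty_2012_NB_GoogleMercator.tiff
gdal2tiles.py -z 10-16 merged.tiff tiles/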

This next step is optional in theory but necessary in practice if you want to be able to preview the TIFF files in TileMill or other imaging software. It adds smaller overview versions of the GeoTIFF to itself for use at lower zoom levels.

gdaladdo --config GDAL_CACHEMAX 300 -r cubic Crook_2012_NB_GoogleMercator.tiff 2 4 8 16 32 64 128 256

Creating tiles

After all of that is done you can load all of your GeoTIFFs up into TileMill and see how they look. I give each TileMill layer a class called “geotiff” and use the following style.

.geotiff
{
  raster-opacity: 1;
  raster-scaling: lanczos; /* Best quality but slowest */
}

You can then export tiles using the standard TileMill process.

There are, of course, always extra considerations, such as output image size/quality. When generating map tiles of roadways and the like, the PNG format is very often the best choice, but for our aerial imagery we want to use JPEG. Following are two tiles at three different quality/compression levels. From left to right: 65%, 75%, 85%.

[Sample tiles rendered at 65%, 75%, and 85% JPEG quality]

Here is what the resulting file size was for the entire area at each of the three quality levels.

3570838528 Mar  7 15:55 aerial_65.mbtiles
4288478208 Mar  6 18:00 aerial_75.mbtiles
5750595584 Mar  6 05:42 aerial_85.mbtiles

Other considerations / Future improvements

Going through this process absolutely explodes the file size. The original NAIP imagery files for my working area are 33,743,711,602 bytes, or a little over 30GB. After converting to GeoTIFF and doing the processing mentioned above, the resulting size of the TIFFs is 952,212,023,220 bytes (closing in on 1TB). One way to greatly reduce this would be to use the JPEG-in-TIFF creation options that GDAL provides; see the example below.
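I have not run this as part of the workflow above, so treat it as a sketch: the GTiff driver accepts creation options for JPEG compression, and the alpha band added by nearblack may need to be dropped or turned into a mask before JPEG compression will apply cleanly.

gdal_translate -of GTiff -co TILED=YES -co COMPRESS=JPEG -co JPEG_QUALITY=85 Crook_2012_NB_GoogleMercator.tiff Crook_2012_NB_GoogleMercator_JPEG.tiff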

My biggest complaint at this point is that not all of the images are uniform with respect to color, brightness, and contrast. GDAL provides some options that could be used to adjust these from the command line, but it would be a very manual process and could take a long time if many iterations are required. I may look at adding some image filters to Mapnik (the mapping engine under TileMill) to enable specifying some simple Photoshop-style corrections.

Distances added to maps and descriptions

Distances (in miles) have been added to all of the maps and descriptions on the website. For trail networks without a specific route, such as the Victoria Network, the distance for each trail segment is listed at the end of the description of that segment. For trails that have a defined route, such as Victoria’s Secret, Victoria 15, and the Victoria Lollipop, the beginning and ending mileage for each segment is listed at the start of the description. For these trails there are also balloons displayed on the map now, indicating mileage traveled at certain waypoints.

Some bits from the archives

The sketch below was a concept done in late 2010 prior to the network/trail pages actually being implemented. While this website may be a modern version of a printed guide book, many facets of the site still have origins on paper.

[Sketch: late 2010 concept for the network/trail pages]