Spot an error? Know how to make this page better? I appreciate pull requests.

Generating this website

Literally, literate

This site is generated using Hakyll, a Haskell library for generating static websites. The raw version of this file (see the source link at the bottom of this post) is compiled into the executable that generates this entire site, and in turn is presented as this post. This is achieved by writing a literate source file.

Assumptions

This is not intended as a beginner tutorial, so some working knowledge of Haskell is assumed going forward. As with all Haskell files, we have to import some modules.

First we import the Hakyll module, which contains the core functionality needed to generate this site. Next we import Hakyll.Web.Sass, which provides a compiler for SCSS.

{-# LANGUAGE OverloadedStrings #-}
import           Hakyll
import           Hakyll.Web.Sass  (sassCompiler)

Next are Control.Monad, which gives us monadic helpers like msum and forM, and Data.Monoid for the (<>) operator. If you are not familiar with functional programming concepts, they are worth looking into.

import           Control.Monad    (msum, forM)
import           Data.Monoid      ((<>))

The Data and System modules provide many of the common functions you would expect in a programming language.

import           Data.Char        (toLower)
import           Data.List        (sortBy, intercalate, isInfixOf)
import           Data.Maybe       (fromMaybe)
import           Data.Ord         (comparing)
import           Data.Time.Clock  (UTCTime (..))
import           Data.Time.Format (defaultTimeLocale, formatTime, parseTimeM)
import           System.FilePath  (replaceBaseName, splitFileName, takeBaseName, takeDirectory, (</>))
import           System.Process   (readProcess)

Finally we import some UTF-8 encoding functions to keep things consistent.

import           GHC.IO.Encoding  (setLocaleEncoding, setFileSystemEncoding, setForeignEncoding, utf8)

Now that we have everything imported we can set some configuration. The providerDirectory is where the built executable will look for source files when building the site. By putting all the files in a subfolder we keep the root folder clean.

config :: Configuration
config = defaultConfiguration
    {
      providerDirectory = "src/provider"
    , storeDirectory    = "src/_cache"
    , tmpDirectory      = "src/_cache/tmp"
    , inMemoryCache     = True
    }
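
With this configuration the working tree looks roughly like the sketch below (illustrative, not exhaustive):

-- src/provider/posts/      published articles
-- src/provider/templates/  page templates
-- src/_cache/              Hakyll's compilation cache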

Here we define the patterns we use to find content.

A ‘post’ is an article that is ready to be published and included in the site’s feed and listings. Grab anything, no matter how deep in the posts directory, regardless of the file extension.

postsPattern :: Pattern
postsPattern = "posts/**"
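-- e.g. (illustrative paths): matches "posts/2020/some-article.md"
-- as well as "posts/deeply/nested/article.markdown"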

A ‘draft’ is an article I am working on and want rendered, but not included in any listings. It can be accessed by providing the direct URL to the page. This is useful if I want to give someone the ability to proof a post before it goes live. Grab anything, no matter how deep in the drafts directory, regardless of the file extension.

draftsPattern :: Pattern
draftsPattern = "drafts/**"

A ‘note’ is whatever I want it to be, so I can organize notes as I see fit.

notesPattern :: Pattern
notesPattern = "notes/**"

Instantiate Hakyll with UTF-8 encoding and the above configuration.

main :: IO ()
main = do
  setLocaleEncoding utf8
  setFileSystemEncoding utf8
  setForeignEncoding utf8
  hakyllWith config $ do

First we need to compile our templates so we can later apply our content to them.

    match "templates/*" $ compile templateBodyCompiler

Each of the following lines generates a portion of our content. More detail follows when we look at each function’s definition.

    staticCss
    scss
    staticAssets
    index
    pages
    notesIndex
    postsIndex
    postsAndNotes
    secrets
    secretsStatic

I am using normalize.css along with a syntax.css file; these get copied directly to the output directory.

staticCss :: Rules ()
staticCss = match "css/*.css" $ do
      route idRoute
      compile copyFileCompiler

SCSS files are compiled to CSS by sassCompiler from the hakyll-sass package, then run through Hakyll’s compressCss to shrink the output.

scss :: Rules ()
scss = match "css/*.scss" $ do
      route $ setExtension "css"
      let compressCssItem = fmap compressCss
      compile (compressCssItem <$> sassCompiler)

The following files are copied verbatim by copyFileCompiler; gsubRoute strips the static/ prefix so they land at the site root.

staticAssets :: Rules ()
staticAssets = match "static/**" $ do
      route $ gsubRoute "static/" (const "")
      compile copyFileCompiler

The index has some unique pieces: a list of the most recent posts and a list of the most recently updated notes.

The indexCtx context below supplies those lists to the template. relativizeUrls rewrites absolute URLs as relative ones, and removeIndexHtml (defined further down) strips index.html from local links so URLs stay clean.

index :: Rules ()
index = match "index.html" $ do
        route idRoute
        compile $ do
            posts <- fmap (take 3) . recentlyCreatedFirst =<< loadAll postsPattern
            notes <- fmap (take 3) . recentlyUpdatedFirst =<< loadAll notesPattern
            let indexCtx =
                    listField "posts" siteContext (return posts) <>
                    listField "notes" siteContext (return notes) <>
                    constField "title" "Home"                <>
                    siteContext
            getResourceBody
                >>= applyAsTemplate indexCtx
                >>= loadAndApplyTemplate "templates/site.html" indexCtx
                >>= relativizeUrls
                >>= withItemBody removeIndexHtml
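
The index.html source then iterates over those lists using Hakyll’s template syntax; an illustrative fragment, not the actual markup:

$for(posts)$
  <a href="$url$">$title$</a>
$endfor$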

Next we build the static pages. They live in the pages subfolder, so we strip that from the URL, compile the content with Pandoc, and drop the result into the site template.

pages :: Rules ()
pages = match "pages/*" $ do
        route $ subFolderRoute `composeRoutes` gsubRoute "pages/" (const "")
        compile $ pandocCompiler
            >>= loadAndApplyTemplate "templates/content.html" siteContext
            >>= loadAndApplyTemplate "templates/site.html" siteContext
            >>= relativizeUrls
            >>= withItemBody removeIndexHtml

Create the notes index page. The notes are ordered lexicographically, and the page needs a title.

notesIndex :: Rules ()
notesIndex = create ["notes.html"] $ do
      route subFolderRoute
      compile $ do
        orderedNotes <- lexicographyOrdered =<< loadAll notesPattern
        let ctx = constField "title" "Notes - Alphabetical" <>
                  listField "notes" siteContext (return orderedNotes) <>
                  siteContext
        makeItem ""
                >>= loadAndApplyTemplate "templates/notes.html" ctx
                >>= loadAndApplyTemplate "templates/site.html" ctx
                >>= relativizeUrls
                >>= withItemBody removeIndexHtml

The typical ‘archive’ page is just /posts, a list of all posts. One day, when I write a lot, I’ll need to figure out pagination.

postsIndex :: Rules ()
postsIndex = create ["posts.html"] $ do
        route subFolderRoute
        compile $ do
            posts <- recentlyCreatedFirst =<< loadAll postsPattern
            let ctx = constField "title" "All Posts" <>
                    listField "posts" siteContext (return posts) <>
                    siteContext
            makeItem ""
                >>= loadAndApplyTemplate "templates/posts.html" ctx
                >>= loadAndApplyTemplate "templates/site.html" ctx
                >>= relativizeUrls
                >>= withItemBody removeIndexHtml

Posts, notes, and drafts are all treated the same; the split is just a logical separation. The .||. operator is a logical OR for patterns, letting one match cover all three.

postsAndNotes :: Rules ()
postsAndNotes = match (postsPattern .||. notesPattern .||. draftsPattern) $ do
        route $ metadataRoute titleFromMetadata `composeRoutes` myPostsRoute `composeRoutes` subFolderRoute
        compile $ pandocCompiler
            >>= loadAndApplyTemplate "templates/content.html" siteContext
            >>= loadAndApplyTemplate "templates/site.html" siteContext
            >>= relativizeUrls
            >>= withItemBody removeIndexHtml

While I strive to keep everything open and transparent, some things are secret. These I store in a private git repo and clone into a folder named secrets under the provider directory.

First we create the secret pages.

secrets :: Rules ()
secrets = match "secrets/*" $ do
  route $ subFolderRoute `composeRoutes` gsubRoute "secrets/" (const "")
  compile $ pandocCompiler
    >>= loadAndApplyTemplate "templates/page.html" siteContext
    >>= loadAndApplyTemplate "templates/site.html" siteContext
    >>= relativizeUrls
    >>= withItemBody removeIndexHtml

As with the static assets above, just blindly copy everything in the secrets folder to the site root, which lets me host arbitrary files. gsubRoute matches a pattern against the route and substitutes it, here replacing the secrets/ prefix with nothing.

secretsStatic :: Rules ()
secretsStatic = match "secrets/**" $ do
  route $ gsubRoute "secrets/" (const "")
  compile copyFileCompiler

removeIndexHtml strips a trailing index.html from local URLs, so internal links point at the clean directory path.

removeIndexHtml :: String -> Compiler String
removeIndexHtml body = return $ withUrls removeIndexStr body
  where
    removeIndexStr url = case splitFileName url of
      (dir, "index.html") | isLocal dir   -> init dir
      _                                   -> url
    isLocal uri = not $ "://" `isInfixOf` uri
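
To make that concrete (illustrative values):

-- removeIndexStr "/posts/foo/index.html" == "/posts/foo"
-- removeIndexStr "https://example.com/index.html" is left alone, as it is not local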

What is a Context? It is a collection of named fields that Hakyll substitutes into template placeholders. siteContext defines the fields available on every page.

shortDateFormat :: String
shortDateFormat = "%B %e, %Y"
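-- e.g. (illustrative): a date of 2020-01-05 renders as "January  5, 2020";
-- %e pads single-digit days with a space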

siteContext :: Context String
siteContext = mconcat
    [ dateFromMetadata "created" "createdDateTime" shortDateFormat
    , dateFromMetadata "updated" "updatedDateTime" shortDateFormat
    -- , gitHistoryUrl "gitHistoryUrl"
    -- , gitCommitUrl "gitCommitUrl"
    -- , gitSourceUrl "gitSourceUrl"
    , gitEditUrl "gitEditUrl"
    , defaultContext
    ]
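
Templates reference these fields with Hakyll’s $...$ syntax; an illustrative fragment, not the actual template:

<p>Created $createdDateTime$, updated $updatedDateTime$.</p>
<a href="$gitEditUrl$">Edit this page on GitHub</a>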

dateFromMetadata reads a date string from the given metadata key, parses it with readTime (defined further down), and exposes it under the given field name in the requested format. todo: make these helpers into one function.

dateFromMetadata :: String -> String -> String -> Context a
dateFromMetadata key value format = field value $ \i -> do
  t <- getMetadataField' (itemIdentifier i) key
  return $ formatTime defaultTimeLocale format (readTime t)
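
For this to work an item needs the corresponding keys in its metadata header, something like this illustrative example:

---
title: Some Post
created: 2020-01-05
updated: 2020-02-10
---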

The git helpers below are currently commented out; they shell out to git via readProcess to build history, commit, and source URLs for each item.

-- gitLog :: String -> String -> IO String
-- gitLog filePath format =
--   readProcess "git" [
--     "log"
--   , "-1"
--   , "HEAD"
--   , "--pretty=format:" ++ format
--   , "--"
--   , filePath
--   ] ""
-- 
-- gitBranch :: IO String
-- gitBranch = do
--   branch <-readProcess "git" [
--       "rev-parse"
--     , "--abbrev-ref"
--     , "HEAD"
--     ] ""
--   return $trim branch
-- 
-- 
-- gitHistoryUrl :: String -> Context String
-- gitHistoryUrl key = field key $ \item -> do
--   let fp = "provider/" ++ toFilePath (itemIdentifier item)
--   unsafeCompiler $ do
--     sha     <- gitLog fp "%h"
--     branch  <- gitBranch
--     let github  =  "https://github.com/kyleondy/kyleondy.com"
--         history = github ++ "/commits/" ++ branch ++ "/" ++ fp
--     return $ if null sha
--                then "Not Committed"
--                else history

The URL to an item’s most recent commit on GitHub.

-- gitCommitUrl :: String -> Context String
-- gitCommitUrl key = field key $ \item -> do
--   let fp = "provider/" ++ toFilePath (itemIdentifier item)
--   unsafeCompiler $ do
--     sha     <- gitLog fp "%h"
--     let github  =  "https://github.com/kyleondy/kyleondy.com"
--         commit  = github ++ "/commit/" ++ sha
--     return $ if null sha
--                then "Not Committed"
--                else commit

The URL to the item’s source at the current time.

Todo: pull the current branch out, so links work when I’m in local dev mode.

todo: combine functions

-- gitSourceUrl :: String -> Context String
-- gitSourceUrl key = field key $ \item -> do
--   let fp     = "provider/" ++ toFilePath (itemIdentifier item)
--   unsafeCompiler $ do
--       branch <- gitBranch
--       return $ "https://github.com/kyleondy/kyleondy.com/blob/" ++ branch ++"/" ++ fp
-- 
gitEditUrl :: String -> Context String
gitEditUrl key = field key $ \item -> do
  let fp     = "provider/" ++ toFilePath (itemIdentifier item)
      branch = "main"
  return $ "https://github.com/kyleondy/kyleondy.com/edit/" ++ branch ++ "/" ++ fp

subFolderRoute places each page in its own directory with an index.html, giving clean URLs without file extensions.

subFolderRoute :: Routes
subFolderRoute = customRoute createIndexRoute
  where
    createIndexRoute ident = takeDirectory p </> takeBaseName p </> "index.html"
                           where p = toFilePath ident
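-- e.g. (illustrative): "notes/vim.md" becomes "notes/vim/index.html"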

dropDateRoute strips a /20xx year folder from the path, and dropSiteRoute drops the literal string site; composed together they form myPostsRoute.

dropDateRoute :: Routes
dropDateRoute = gsubRoute "/20[0-9]{2}" $ const ""

dropSiteRoute :: Routes
dropSiteRoute = gsubRoute "site" $ const ""

myPostsRoute :: Routes
myPostsRoute = dropDateRoute `composeRoutes` dropSiteRoute
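-- e.g. (illustrative): "posts/2020/announcing.md" becomes "posts/announcing.md",
-- which subFolderRoute later turns into "posts/announcing/index.html"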

todo: consolidate the below functions; a sketch of what that might look like follows them.

lexicographyOrdered :: [Item a] -> Compiler [Item a]
lexicographyOrdered items = return $
              sortBy (comparing (takeBaseName . toFilePath . itemIdentifier)) items

recentlyUpdatedFirst :: [Item a] -> Compiler [Item a]
recentlyUpdatedFirst items = do
    itemsWithTime <- forM items $ \item -> do
        updateTime <- getMetadataField (itemIdentifier item) "updated"
        return (updateTime,item)
    return $ reverse (map snd (sortBy (comparing fst) itemsWithTime))

recentlyCreatedFirst :: [Item a] -> Compiler [Item a]
recentlyCreatedFirst items = do
    itemsWithTime <- forM items $ \item -> do
        updateTime <- getMetadataField (itemIdentifier item) "created"
        return (updateTime,item)
    return $ reverse (map snd (sortBy (comparing fst) itemsWithTime))
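
These two functions differ only in the metadata key they sort on; a consolidation might look like the following sketch (commented out, not yet wired in):

-- recentFirstBy :: String -> [Item a] -> Compiler [Item a]
-- recentFirstBy key items = do
--     itemsWithTime <- forM items $ \item -> do
--         time <- getMetadataField (itemIdentifier item) key
--         return (time, item)
--     return $ reverse (map snd (sortBy (comparing fst) itemsWithTime))
--
-- recentlyUpdatedFirst = recentFirstBy "updated"
-- recentlyCreatedFirst = recentFirstBy "created"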

OK, after that little detour, let’s get back to it! The functions above made use of two helpers which haven’t actually been defined. The first is readTime, which dateFromMetadata uses to normalise the date format. It takes a date string and converts it to a UTCTime which we can manipulate.

readTime :: String -> UTCTime
readTime t = fromMaybe empty' . msum $ attempts where
  attempts   = [parseTimeM True defaultTimeLocale fmt t | fmt <- formats]
  empty'     = error $ "Could not parse date field: " ++ t
  formats    = [ "%a, %d %b %Y %H:%M:%S %Z"
               , "%Y-%m-%dT%H:%M:%S%Z"
               , "%Y-%m-%d %H:%M:%S%Z"
               , "%Y-%m-%d %H:%M"
               , "%Y-%m-%d"
               , "%B %e, %Y %l:%M %p"
               , "%B %e, %Y"
               , "%b %d, %Y"
               ]

The second helper is titleFromMetadata, which renames the output file after the item’s title metadata, falling back to idRoute when no title is set.

titleFromMetadata :: Metadata -> Routes
titleFromMetadata meta = maybe idRoute mkName (getField "title")
  where mkName t    = setBaseName $ slugify t
        getField    = (`lookupString` meta)

The basic idea for the implementation is taken from Hakyll itself, from its getItemUTC which is defined in Hakyll.Web.Template.Context. Unfortunately, the type signature for that function is quite a lot more complicated than we need, so I’ve extracted the parts we need into a simple String -> UTCTime function here. If the date doesn’t match any of the supported formats readTime will simply crash with an error – not the best error handling, but since we’re always going to be running this interactively it doesn’t really matter.

setBaseName builds a route that swaps an item’s base name using replaceBaseName from System.FilePath.

setBaseName :: String -> Routes
setBaseName basename = customRoute $
  (`replaceBaseName` basename) . toFilePath

slugify lowercases the title, turns any character outside letters, digits, and spaces into a space, and joins the resulting words with hyphens.

slugify :: String -> String
slugify = intercalate "-" . words . map (\x -> if x `elem` allowedChars then toLower x else ' ')
  where allowedChars = ['a'..'z'] ++ ['A'..'Z'] ++ ['0'..'9'] ++ " "
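-- e.g. (illustrative): slugify "Hello, World!" == "hello-world"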