#+filetags: indieweb blogging open_web
#+title: Implementing Webmention on my blog
-Following on from my last [[Bring_Back_Blogging][post on joining the indieweb]]...
+Following on from my last [[file:Bring_Back_Blogging.org][post on joining the indieweb]]...
Back in February, I implemented Webmentions on my website. I took a roll-my-own approach, borrowing from [[http://superkuh.com/blog/2020-01-10-1.html][an idea by superkuh]]. It's a semi-automated solution that listens for webmentions using nginx. When (if) one is received, an email is generated to notify me, allowing me to validate that it's a genuine comment.
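For context, a webmention itself is just a tiny HTTP exchange: the sender POSTs a form-encoded source and target URL pair to whatever endpoint the receiver advertises. Here is a minimal sketch of the sending side, assuming placeholder URLs and endpoint (this is an illustration, not the nginx setup described above), in R with httr:
#+BEGIN_SRC R
## Minimal sketch of sending a webmention; endpoint and URLs are placeholders.
library(httr)

endpoint <- "https://example.net/webmention"   # endpoint advertised by the receiver

resp <- POST(endpoint,
             body = list(source = "https://example.org/my-reply",    # page doing the mentioning
                         target = "https://example.net/their-post"), # page being mentioned
             encode = "form")                  # webmentions are form-encoded POSTs

status_code(resp)  # a 2xx response means the mention was accepted or queued
#+END_SRC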
than one's opponent. Most interestingly, in both loops it's a common
pitfall to skip the last step - deploying the model / acting.
-#+CAPTION: OODA loop ([[Image by Patrick Edwin Moran - Own work, CC BY 3.0|https://commons.wikimedia.org/w/index.php?curid=3904554]])
+#+CAPTION: OODA loop ([[https://commons.wikimedia.org/w/index.php?curid=3904554][Image by Patrick Edwin Moran - Own work, CC BY 3.0]])
#+ATTR_HTML: :class img-fluid :alt OODA loop
[[file:../assets/OODA-diagram.png]]
be so.
Today, I believe I have stumbled on a description of the style I
-practice (or certainly aim to) most often on [[Adam Drake's
-blog|https://aadrake.com/]]. Its name? [[Mission
-Command|http://usacac.army.mil/sites/default/files/misc/doctrine/CDG/adp6_0.html]].
+practice (or certainly aim to) most often on [[https://aadrake.com/][Adam Drake's blog]]. Its name? [[http://usacac.army.mil/sites/default/files/misc/doctrine/CDG/adp6_0.html][Mission Command]].
(The key alternative being detailed command.)
Now this is an interesting revelation for more than one reason. I
In a 'blast from the past', I sent my first
[[https://www.hixie.ch/specs/pingback/pingback][pingback]] after writing
-the [[previous post|agent based models digital twins]]. A pingback is a
+the [[file:agent_based_models_digital_twins.org][previous post]]. A pingback is a
way for a blogger to send a message to another blogger, informing them
they've written a post that refers to theirs, e.g. as a reply or an
extension of the ideas raised.
The process is a bit more involved than using a
[[https://www.w3.org/TR/webmention/][webmention]], which I've used
-before and [[implemented support for|Implementing webmention on my
-blog]] a while back, due to requiring an XML message to be created
+before and [[file:Implementing_Webmention_on_my_blog.org][implemented support for]] a while back, due to requiring an XML message to be created
rather than a simple exchange of URLs.
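The XML message in question is a small XML-RPC call, =pingback.ping=, whose two parameters are the source and target URLs. As a rough illustration only (the URLs and endpoint below are placeholders, and the post itself keeps the XML in a standalone file), the exchange boils down to something like this in R:
#+BEGIN_SRC R
## Illustrative sketch of a pingback XML-RPC call; URLs and endpoint are placeholders.
library(httr)

source_url <- "https://example.org/my-new-post"        # the post that links out
target_url <- "https://example.net/the-original-post"  # the post being referenced

body <- sprintf('<?xml version="1.0"?>
<methodCall>
  <methodName>pingback.ping</methodName>
  <params>
    <param><value><string>%s</string></value></param>
    <param><value><string>%s</string></value></param>
  </params>
</methodCall>', source_url, target_url)

## The endpoint is advertised by the target page, e.g. via an X-Pingback
## header or a <link rel="pingback"> element.
resp <- POST("https://example.net/xmlrpc", body = body, content_type("text/xml"))
status_code(resp)
#+END_SRC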
First, I created a file =pingback.xml= containing the URLs of the blog
#+title: Spatial indexes to plot income per postal code in Australian cities.
Trying to plot the income per capita in Australia on a map, I came
-across a perfectly good reason to make good use of [[a spatial
-query|http://r-spatial.org/r/2017/06/22/spatial-index.html]] in R.
+across a perfectly good reason to make good use of [[http://r-spatial.org/r/2017/06/22/spatial-index.html][a spatial query]] in R.
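As a hedged sketch of the kind of spatial query this enables (file names below are placeholders, not the actual datasets used in this post), sf can match two sets of geometries with =st_join()=, which uses a spatial index under the hood:
#+BEGIN_SRC R
## Sketch of a spatial join with sf; file names are placeholders.
library(sf)

sa3 <- st_read("sa3_boundaries.shp")       # statistical area polygons
postcodes <- st_read("postal_areas.shp")   # postal area polygons

## Represent each postal area by a point guaranteed to lie inside it,
## then look up which SA3 that point falls in. The spatial index keeps
## this fast even with thousands of polygons.
postcode_points <- st_point_on_surface(postcodes)
postcodes_by_sa3 <- st_join(postcode_points, sa3, join = st_within)

head(postcodes_by_sa3)
#+END_SRC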
I had to combine a shapefile of Australian SA3s, a concept used under
the Australian Statistical Geography Standard meaning Statistical Area
Working in analytics these days, I find the concept of big data firmly
established. Smart engineers have been developing cool technology to
-work with it for a while now. The [[Apache Software
-Foundation|https://apache.org]] has emerged as a hub for many of these -
+work with it for a while now. The [[https://apache.org][Apache Software Foundation]] has emerged as a hub for many of these -
Ambari, Hadoop, Hive, Kafka, Nifi, Pig, Zookeeper - the list goes on.
While I'm mostly interested in improving business outcomes applying
Over the past few weeks, I have been exploring some tools, installing
them on my laptop or a server and giving them a spin. Thanks to
-[[Confluent, the founders of Kafka|https://www.confluent.io]] it is
+[[https://www.confluent.io][Confluent, the founders of Kafka]] it is
super easy to try out Kafka, Zookeeper, KSQL and their REST API. They
all come in a pre-compiled tarball which just works on Arch Linux.
(After trying to compile some of these, this is no luxury - these apps
#+END_SRC
I also spun up an instance of
-[[nifi|https://nifi.apache.org/download.html]], which I used to monitor
+[[https://nifi.apache.org/download.html][nifi]], which I used to monitor
a (json-ised) apache2 webserver log. Every new line added to that log
goes as a message to Kafka.
-[[Apache Nifi configuration|/pics/ApacheNifi.png]]
+#+CAPTION: Apache Nifi configuration
+#+ATTR_HTML: :class img-fluid :alt Apache Nifi configuration
+[[file:../assets/ApacheNifi.png]]
A processor monitoring a file (tailing) copies every new line over to
another processor publishing it to a Kafka topic. The Tailfile monitor
#+title: Using spatial features and openstreetmap
Turns out it is possible, thanks to the good folks at
-[[file:stamen.com][Stamen Design]], to get fairly unobtrusive maps based
+[[http://stamen.com][Stamen Design]], to get fairly unobtrusive maps based
on the OpenStreetMap data.
Combining this with a SLIP dataset from the Western Australian
this makes manipulating shapefiles and the like easier, as simple
features in R are dataframes!
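Here is a rough sketch of that combination; the bounding box, file name and column name below are placeholders, and ggmap's =get_stamenmap()= was the way to fetch these tiles at the time of writing:
#+BEGIN_SRC R
## Sketch only: the bounding box, file name and 'students' column are placeholders.
library(ggmap)
library(ggplot2)
library(sf)

## Rough bounding box around the Perth metro area.
bbox <- c(left = 115.6, bottom = -32.4, right = 116.2, top = -31.6)
basemap <- get_stamenmap(bbox, zoom = 10, maptype = "toner-lite")

schools <- st_read("secondary_schools.shp")            # placeholder dataset
schools_df <- cbind(st_drop_geometry(schools),
                    st_coordinates(st_transform(schools, 4326)))

ggmap(basemap) +
  geom_point(data = schools_df,
             aes(x = X, y = Y, size = students),
             colour = "darkred", alpha = 0.6)
#+END_SRC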
-[[Plot of secondary schools by student
-population|/pics/Western_Australia_metro_secondary_schools.png]]
+#+CAPTION: Plot of secondary schools by student population.
+#+ATTR_HTML: :class img-fluid :alt Plot of secondary schools by student population.
+[[file:../assets/Western_Australia_metro_secondary_schools.png]]
[[http://git.vanrenterghem.biz/?p=R/project-wa-schools.git;a=summary][Full
details in git]].