Emit logs in JSON format from fastapi/gunicorn, for DataDog

import logging

from pythonjsonlogger import jsonlogger

def init() -> None:
    log_in_json()
    map_levelname_to_status()

# Force all loggers to talk JSON rather than text so DataDog can parse the output.
def log_in_json() -> None:
    formatter = jsonlogger.JsonFormatter("%(asctime)s %(levelname)s %(name)s %(message)s")
    log_handler = logging.StreamHandler()
    log_handler.setFormatter(formatter)
    # The list of loggers was truncated here - root plus the gunicorn/uvicorn
    # loggers is a reasonable guess for a fastapi-under-gunicorn setup.
    names = ["", "gunicorn.access", "gunicorn.error", "uvicorn.access", "uvicorn.error"]
    loggers = [logging.getLogger(name) for name in names]
    for logger in loggers:
        logger.setLevel(logging.DEBUG)
        for handler in list(logger.handlers):
            logger.removeHandler(handler)
        logger.addHandler(log_handler)

# DataDog is expecting 'status' for log level but the python default is 'levelname'.
def map_levelname_to_status() -> None:
    old_factory = logging.getLogRecordFactory()

    def record_factory(*args: str, **kwargs: str) -> logging.LogRecord:
        record = old_factory(*args, **kwargs)
        record.status = record.levelname  # type: ignore
        return record

    logging.setLogRecordFactory(record_factory)

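As a minimal standard-library-only sketch of the record-factory trick (the logger name and message here are arbitrary), every record created after the swap carries a `status` attribute that a JSON formatter can then emit:

```python
import logging

# Wrap the current record factory so every new LogRecord also carries
# a 'status' attribute mirroring 'levelname' (the key DataDog expects).
old_factory = logging.getLogRecordFactory()

def record_factory(*args, **kwargs):
    record = old_factory(*args, **kwargs)
    record.status = record.levelname
    return record

logging.setLogRecordFactory(record_factory)

# Any record made from here on has .status set.
record = logging.getLogger("demo").makeRecord(
    "demo", logging.WARNING, __file__, 1, "something happened", None, None
)
print(record.status)  # WARNING
```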

That One Blog Post That Gets All the Emails

In 2003 I spent half a day figuring out a bad Microsoft error. I hadn’t been able to Google the answer. And I guess this was before the days of Stack Overflow. So I wrote it up as a blog post – https://bluebones.net/2003/07/server-did-not-recognize-http-header-soapaction/

Because I’d been extremely frustrated by the error I included at the bottom, “If you’re having a similar problem but can’t work out what I’m saying here, feel free to mail me on bakert+web@gmail.com – I wouldn’t wish my four hours on anyone!”

I never get email about any of my other blog posts. But Gmail tells me I’ve had at least 269 about this one. Including within the last two weeks! If you find this phenomenon interesting and wish to discuss it further, feel free to email me at bakert@gmail.com 😉

bakert’s 4th Law

The Backlog Never Gets Smaller


  1. All Production Code is Shit
  2. It’s more important to have a standard than what the standard is
  3. TODOs don’t get TODOne

“Perfect” Libraries

Sometimes you use a third party library and the interface is so well designed it’s just effortless. Something that would have been gnarly and murky becomes simple. The kind of library that gets ported to multiple languages because everyone wants access to it.

One slightly obscure example is feedparser, Mark Pilgrim’s (originally Python 2) library for reading Atom and RSS feeds. It hides all the nonsense of hand-parsing feed XML behind a simple interface:

>>> import feedparser
>>> d = feedparser.parse('http://www.reddit.com/r/python/.rss')
>>> print d.feed.title
Python
>>> print d.feed.subtitle
news about the dynamic, interpreted, interactive, object-oriented, extensible programming language Python
>>> print d.headers
{'content-length': '5393', 'content-encoding': 'gzip', 'vary': 'accept-encoding', 'server': "'; DROP TABLE servertypes; --", 'connection': 'close', 'date': 'Mon, 14 Oct 2013 09:13:34 GMT', 'content-type': 'text/xml; charset=UTF-8'}

Another library with the same effortless simplicity is Mustache, the logic-less template language. It has been ported to literally dozens of languages. Every template I ever worked on was kind of a mess until I found Mustache. It’s actually the restrictions that make it sing.

Hello {{name}} 
You have just won {{value}} dollars! 
{{#in_ca}} Well, {{taxed_value}} dollars, after taxes. {{/in_ca}}
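To make the “logic-less” point concrete, here is a toy Python sketch supporting only the two features the template above uses – `{{name}}` interpolation and truthy `{{#section}}` blocks. (Real Mustache implementations do far more; the data values are the canonical Mustache demo ones, not from this post.)

```python
import re

def render(template, data):
    # Expand {{#key}}...{{/key}} sections when the key is truthy, drop them otherwise.
    def section(m):
        key, body = m.group(1), m.group(2)
        return body if data.get(key) else ""
    template = re.sub(r"\{\{#(\w+)\}\}(.*?)\{\{/\1\}\}", section, template, flags=re.S)
    # Interpolate plain {{key}} variables.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(data.get(m.group(1), "")), template)

template = (
    "Hello {{name}}\n"
    "You have just won {{value}} dollars!\n"
    "{{#in_ca}}Well, {{taxed_value}} dollars, after taxes.{{/in_ca}}"
)
data = {"name": "Chris", "value": 10000, "taxed_value": 6000, "in_ca": True}
print(render(template, data))
```

Everything the template can do is visible in the template itself; any actual logic (like computing `taxed_value`) has to live in the code that builds the data.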

Some other examples:

  • web.py – Dead simple web framework
  • BeautifulSoup – HTML/XML parser
  • requests – Python library for HTTP
  • humps – Underscore-to-camelCase converter (and vice versa) for strings and object keys in JavaScript (has been ported to Python as pyhumps).
  • Markdown – Text format with HTML representation that has taken over the web due to its simplicity and usefulness compared to actual HTML

Do you know any “perfect” libraries?

GPX to PostGIS, PostGIS to GPX

With ogr2ogr.

export CONN_STRING="host=localhost dbname=DATABASE user=USERNAME password=PASSWORD port=5432"
# Import
ogr2ogr -append -f PostgreSQL PG:"$CONN_STRING" /path/to/your.gpx
# Export
ogr2ogr -f gpx -nlt MULTILINESTRING /path/to/output/tracks.gpx PG:"$CONN_STRING" "tracks(wkb_geometry)"
ogr2ogr -f gpx -nlt MULTILINESTRING /path/to/output/routes.gpx PG:"$CONN_STRING" "routes(wkb_geometry)"
ogr2ogr -f gpx -nlt POINT /path/to/output/waypoints.gpx PG:"$CONN_STRING" "waypoints(wkb_geometry)"

The wkb_geometry references can be replaced with full SQL statements as required.
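For example (a sketch only – the WHERE clause and column names beyond those above are assumptions), an export can select geometry with an arbitrary query via ogr2ogr’s `-sql` flag in place of the `table(column)` shorthand:

```shell
# Export a subset of tracks by swapping the table(column) reference
# for a full SQL query (the name filter here is hypothetical).
ogr2ogr -f gpx /path/to/output/some-tracks.gpx PG:"$CONN_STRING" \
  -sql "SELECT wkb_geometry FROM tracks WHERE name LIKE 'Morning%'"
```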

Short Variable Names in Go

In a recent code review my colleague took issue with the following code.

func Enqueue(ctx context.Context, properties Properties) (err error) {
	logger := logging.GetLogger(ctx)
	bs, err := json.Marshal(properties)
	if err != nil {
		logger = logger.With().Err(err).Logger()
	} else {
		logger = logger.With().RawJSON("properties", bs).Logger()
        … go on to log some stuff and enqueue the supplied event with the supplied properties …

Specifically the question was around whether `bs` was a reasonable name for the variable holding the JSON version of the properties. My counterargument was that short names are better than long names when well understood and/or short in scope. And that Go has a C influence and favors short variable names which you can see in both the standard library and its examples. The Go encoding/json library calls []byte variously src and data (code), b, j and text (examples) – https://golang.org/pkg/encoding/json/

My colleague said it took them longer than 0s to understand the variable, so they called it out as a nit (not a blocker), and that they care more about knowing what a variable contains than whether it is `[]byte` or not.

I ended up renaming it `propertiesJSON`. It did start a discussion about short variable names in general and in Go in particular. I’m still not sure how I feel about it. I did find some reading that seemed relevant though.

Notes on Programming in C (Variable names)

Length is not a virtue in a name; clarity of expression is. … An array index used on every line of a loop needn’t be named any more elaborately than i.

Go Code Review Comments (Variable Names)

Variable names in Go should be short rather than long. This is especially true for local variables with limited scope. Prefer c to lineCount. Prefer i to sliceIndex.

What’s In a Name? (Local Variables)

Local variables. Keep them short; long names obscure what the code does … Prefer b to buffer.

Share Clipboard Between Terminal (zsh) and OSX

### System-wide Clipboard mostly from https://gist.github.com/welldan97/5127861

pb-kill-line () {
  zle kill-line
  echo -n $CUTBUFFER | pbcopy
}

pb-backward-kill-line () {
  zle backward-kill-line
  echo -n $CUTBUFFER | pbcopy
}

pb-kill-whole-line () {
  zle kill-whole-line
  echo -n $CUTBUFFER | pbcopy
}

pb-backward-kill-word () {
  zle backward-kill-word
  echo -n $CUTBUFFER | pbcopy
}

pb-kill-word () {
  zle kill-word
  echo -n $CUTBUFFER | pbcopy
}

pb-kill-buffer () {
  zle kill-buffer
  echo -n $CUTBUFFER | pbcopy
}

pb-copy-region-as-kill-deactivate-mark () {
  zle copy-region-as-kill
  zle set-mark-command -n -1
  echo -n $CUTBUFFER | pbcopy
}

pb-yank () {
  CUTBUFFER=$(pbpaste)
  zle yank
}

zle -N pb-kill-line
zle -N pb-backward-kill-line
zle -N pb-kill-whole-line
# This is too extreme - I often want to wrangle a commandline then paste into it.
#zle -N pb-backward-kill-word
#zle -N pb-kill-word
zle -N pb-kill-buffer
zle -N pb-copy-region-as-kill-deactivate-mark
zle -N pb-yank

bindkey '^K'   pb-kill-line
bindkey '^U'   pb-backward-kill-line
#bindkey '\e^?' pb-backward-kill-word
#bindkey '\e^H' pb-backward-kill-word
#bindkey '^W'   pb-backward-kill-word
#bindkey '\ed'  pb-kill-word
#bindkey '\eD'  pb-kill-word
bindkey '^X^K' pb-kill-buffer
bindkey '\ew'  pb-copy-region-as-kill-deactivate-mark
bindkey '\eW'  pb-copy-region-as-kill-deactivate-mark
bindkey '^Y'   pb-yank