A brief history of a tiny part of the Internet.

Blackhole 1 (or simply Blackhole, as it was originally known) was written in Python 2.7, briefly supported Python 2.6 for a time, and also supported early versions of Python 3, PyPy 2 and PyPy 3. Built on top of Tornado, it was asynchronous after a fashion and, quite simply, worked.

The original prototype that became Blackhole was SimpleMTA, a prototype thrown together quickly to serve a very simple testing purpose I had at the time.

As I needed SimpleMTA to do more, I wrote Blackhole to accomplish that task. I'd been using Tornado a bit and wanted to experiment with it more, but building on top of Tornado created some oddities in how the program was designed, and that always irked me.

Between the time of the last 1.8.X and the 2.0 release, I experimented with …

Batfish is a Python client and API wrapper for the Digital Ocean V2 API. It can be used as a library in your own Python code, but it also provides a command-line interface and a shell-like command interpreter.

Batfish is still under development and is considered alpha-stage software. It is not yet available via PyPI, but it can be tried out using the code available on GitHub.

There is a small amount of documentation available on Read the Docs, and tests are still being written to get as much coverage as possible and shake out all of the bugs. You can find the latest test status on Travis CI.

Module interface

>>> from batfish import Client
>>> client = Client()
>>> client.authorize("abcde12345")
>>> client.droplets
[<Droplet ego.kura.io>, <Droplet fax.kura.io>, <Droplet jet.kura.io>, <Droplet ski.kura.io>]
>>> client.droplet_reboot(1234)

CLI interface

$ batfish authorize
abcde12345
$ batfish droplets
ego …

Yarg is a PyPI client. It was written for pypip.in and can search packages, as well as read the RSS feeds from PyPI for new packages and new package version releases.

Search interface

>>> import yarg
>>> package = yarg.get("yarg")
>>> package.name
u'yarg'
>>> package.author
Author(name=u'Kura', email=u'kura@kura.io')

Newest packages interface

>>> import yarg
>>> packages = yarg.newest_packages()
>>> packages
[<Package yarg>, <Package gray>, <Package ragy>]
>>> packages[0].name
u'yarg'
>>> packages[0].url
u'http://pypi.python.org/pypi/yarg'

Updated packages interface

>>> import yarg
>>> packages = yarg.latest_updated_packages()
>>> packages
[<Package yarg>, <Package gray>, <Package ragy>]
>>> packages[0].name
u'yarg'
>>> packages[0].version
u'0.1.2'
>>> packages[0].url
u'http://pypi.python.org/pypi/yarg/0.1.2'

Documentation

Full documentation is at <https://yarg.readthedocs.org>.

As you might expect, pypip.in employs a fair amount of caching in the backend to control load on the imaging API and servers.

For a long time, this cache was managed entirely by Varnish, and it did a fantastic job. Varnish has a hit:miss ratio of 10:1; for every 10 hits we get 1 miss. This is a fairly decent ratio when you consider where these images are displayed, how often they are viewed and that Varnish only caches the images for an hour.

The impact on PyPI

To understand the changes that were made, and why they were made, you first need to understand how pypip.in used to work.

Let’s set up the request first: a request for a shield is made and it is not present in the Varnish cache.

Request received in API layer
              |
              v
    API layer queries PyPI
              |
              v
   PyPI …

While pypip.in is available under the MIT license on GitHub, how to actually deploy and use it properly has never been explained.

You can work out how to set up the Python source of the project and get the Twisted process running, but that setup is totally reliant on using img.shields.io.

I decided to write this article explaining how to install your own copy of the shields nodejs code and pypip.in itself, and even to cover supervisord and Varnish too.

shields & nodejs

nodejs

First of all you’ll need to get the latest source code copy of nodejs from the nodejs download page.

Extract it.

tar -xvzf node-<VERSION>.tar.gz
cd node-<VERSION>

You’ll need to install the build tools, if you don’t have them already.

sudo apt-get install build-essential

And then configure, build and install node.

./configure
make && sudo make install

Redis

Redis is used to temporarily store PyPI responses.

sudo …

Supported Python versions

This one is generated from the list of classifiers you provide to PyPI.

If no Python version classifiers exist, it defaults to Python 2.7. This is because, realistically, Python 3 is not yet widely used in production or supported by libraries.

Python implementation(s)

I think this one is really cool. Chances are you’re unlikely to get more than two supported implementations, like CPython and PyPy or CPython and Stackless.

This shield is generated from the Python implementation classifiers. It supports all the implementation classifiers that PyPI supports (CPython, Jython, IronPython, PyPy and Stackless) and defaults to CPython if none are set.
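The selection logic boils down to filtering trove classifiers by their prefix. A rough sketch (the function name and defaulting are my own illustration, not pypip.in's actual code):

```python
def implementations(classifiers):
    """Extract Python implementation names from a package's PyPI
    trove classifiers, defaulting to CPython when none are set."""
    prefix = 'Programming Language :: Python :: Implementation :: '
    found = [c[len(prefix):] for c in classifiers if c.startswith(prefix)]
    return found or ['CPython']

print(implementations([
    'Programming Language :: Python :: Implementation :: CPython',
    'Programming Language :: Python :: Implementation :: PyPy',
]))  # → ['CPython', 'PyPy']
```

The same prefix-matching approach works for the Python version shield, just with the version classifiers instead.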

Styling changes

This change is simply a result of upgrading the shields library, which allows us to use the default rounded badges like the ones below.

But it also allows you to use a much nicer, cleaner, flat styling like the ones used on this …

Pelican FontAwesome allows you to embed FontAwesome icons in your RST posts and pages.

Installation

To install pelican-fontawesome, simply install it from PyPI:

$ pip install pelican-fontawesome

Then enable it in your pelicanconf.py:

PLUGINS = [
    # ...
    'pelican_fontawesome',
    # ...
]

Include the FontAwesome CSS in your base template.

<link href="//netdna.bootstrapcdn.com/font-awesome/4.1.0/css/font-awesome.min.css" rel="stylesheet">

Usage

In your article or page, you simply need to add a reference to FontAwesome and then the icon name.

:fa:`fa-github`

Which will result in:

<span class="fa fa-github"></span>

And the user will see:

You can also increase the size, just like the FontAwesome documentation shows.

:fa:`fa-github fa-4x`

Will result in:
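Under the hood, a role like this maps the role text straight onto an HTML span. A minimal sketch of how such a role can be registered with docutils (the plugin's real implementation may differ):

```python
from docutils import nodes
from docutils.parsers.rst import roles

def fa_role(name, rawtext, text, lineno, inliner, options=None, content=None):
    # Emit raw HTML carrying the FontAwesome classes taken from the
    # role's text, e.g. 'fa-github fa-4x'.
    html = '<span class="fa {0}"></span>'.format(text)
    return [nodes.raw('', html, format='html')], []

# Make :fa:`...` available to reStructuredText documents.
roles.register_local_role('fa', fa_role)
```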

License

MIT license.

Abstract

This proposal describes a build system for generating “wheel” archives and is very, very informal. The plan was drawn up after a random discussion with Jannis Leidel on Twitter and IRC.

Wheel files can be platform- and Python-version-dependent, so a way of generating these files automatically needs to be created and linked to the Packaging Index (PyPI).

Design

After discussions with Jannis, I believe the simplest solution would likely be the best solution for this problem. As such, I feel that using a custom-built, lightweight solution makes more sense than using something like buildbot.

Technology

I feel the platform should leverage existing Python packages that are tried, tested and well used in the community. Therefore I feel we should use a combination of the following:

  • RabbitMQ for queueing builds,
  • Celery for building wheels, and
  • pyenv for managing multiple Python versions.
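As a sketch of how these pieces could fit together (the command layout, names and pyenv usage here are assumptions, not a finished design), a build worker would shell out to pip under a pyenv-selected interpreter; in the proposed design this function would run inside a Celery task consuming from RabbitMQ:

```python
import os

def wheel_build_command(package, version, python_version):
    """Return the command and environment a build worker could use to
    build a wheel for *package* under *python_version* via pyenv."""
    # pyenv selects the interpreter through the PYENV_VERSION variable.
    env = dict(os.environ, PYENV_VERSION=python_version)
    cmd = ['pyenv', 'exec', 'pip', 'wheel',
           '{0}=={1}'.format(package, version)]
    return cmd, env

cmd, env = wheel_build_command('yarg', '0.1.2', '2.7.8')
print(' '.join(cmd))  # → pyenv exec pip wheel yarg==0.1.2
```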

Operating systems

I lack any understanding of Windows or …

tugboat-bash-completion is a bash completion script for the tugboat CLI for the Digital Ocean API.

Downloads

Installation

Debian/Ubuntu

Install manually

Download the source file from above and run the commands below.

sudo make install
. ~/.bashrc

Or you can do it the lazy way:

sudo wget https://raw.githubusercontent.com/kura/tugboat-bash-completion/master/tugboat \
    -O /etc/bash_completion.d/tugboat
. ~/.bashrc

Notes

It’s worth noting that any command that supports a FUZZY_MATCH will take a small amount of time to respond, due to querying the API for a list of either droplets or images.

Commands that do a droplet lookup:

  • destroy
  • halt
  • info
  • password-reset
  • rebuild
  • resize
  • restart
  • snapshot
  • ssh
  • start
  • wait

Commands that do an image lookup:

  • destroy_image
  • info_image
  • rebuild

Source

The source can be found on GitHub.

Issues

Issues can be tracked using GitHub Issues.

License

This software is licensed using the MIT License. The license is provided in …

If you’re a Python developer who uses Ubuntu, or even Debian, you have probably heard of Felix Krull’s deadsnakes PPA.

I find myself using it a lot, and since I tend to destroy my environments quite frequently, I thought I’d save myself some time and write a simple shell script to install the PPA and the versions of Python I frequently use while, after once messing up a machine, leaving the existing system version of Python alone.

Some time back in April 2013 I was bored and looking for a new project to keep my attention, if only for a short period of time.

My colleague @codeinthehole had an idea but no time to implement it: shields like those of travis-ci (shown below), but displaying package download counts.

Status of blackhole on Travis CI

Tech stack

From the very start I decided to use the Tornado framework, although this may change in the future.

The original plan was to generate the images using Pillow (PIL) and then simply cache them to disk. I decided it would make far more sense to let Varnish do the caching instead, and not have to worry about a homegrown disk cache working as expected.

Manually generating the images

The images were originally generated from a base template using Pillow but, sadly, Python’s image manipulation is not very good, especially its text rendering, and the shields could …

I love my prompt, always have and always will. I spend so much of my life in a terminal, usually with half a dozen mini terminals open in each tab. As such I like to tweak it and get it as perfect as possible for my life, needs and even mood.

In the past I’ve had quite a large PS1 that covers multiple lines and gives a lot of information. After having that PS1 in one form or another for some time, I decided it was time for a change: a smaller PS1 that takes up a lot less space.

So here it is. The first image is my standard PS1 when in a git repository: a red @ means a file hasn’t been added to Git, a blue @ means a tracked file has been modified but not staged and, finally, a green @ means a file is staged …

Over the last week I’ve been doing a huge amount of refactoring of Blackhole as well as writing dozens of additional tests. To make Blackhole more testable I needed to make a big change to how the program is launched and controlled.

setup.py scripts vs. entry_points

Whenever I’ve written Python programs that require some kind of command line script, I’ve always used distutils’ scripts; this can be seen in blackhole’s setup.py on GitHub, or in the three-line example below.

scripts=[
    'blackhole/bin/blackhole',
],

In doing so, it allowed me to be lazy and write a lot of procedural code in the main “binary”, which made it pretty much impossible to test. You can also see that on GitHub in the main “binary”.

I’ve noticed that most people who write Python packages that have some kind of command line entry point use distutils …
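For comparison, the entry_points equivalent looks something like the fragment below (a sketch; the module path blackhole.application:run is an assumption, the real entry point may be named differently):

```python
from setuptools import setup

setup(
    # ... name, version and friends elided ...
    entry_points={
        'console_scripts': [
            # 'command = package.module:function' -- setuptools
            # generates the script wrapper, so the logic lives in an
            # importable (and therefore testable) function.
            'blackhole = blackhole.application:run',
        ],
    },
)
```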

Pelican is a Python-powered static blog generator that processes ReStructuredText and Markdown articles and pages and converts them to HTML. I use Pelican to power this blog.

There is a YouTube RST directive built into Pelican core, but it really shouldn’t exist there.

I submitted a pull request for Pelican core to enable Vimeo videos in articles but that request was declined because it didn’t belong in the core. So I decided I would write it as a plugin instead and while I was doing it, also wrote a plugin for YouTube so that it could be removed from the core.

There is a decent amount of detail in the Pelican documentation on how to write plugins. I’m not going to cover the whole process, but I thought I would cover a little of what I did.

Adding an RST directive

Really all we’re doing …

Blackhole has always been able to handle unencrypted SMTP and for a long time it’s been able to handle encrypted SMTP via TLSv1.

One thing Blackhole hasn’t been able to do until the 1.7.0 release is handle STARTTLS.

In the past, the STARTTLS command would cause Blackhole to return the standard 250 OK response but continue to operate over unencrypted SMTP.

I wanted to fix this and do it properly, but that meant learning how to do so with Tornado, which proved to be tricky. I ended up deciding to go to my local coding spot (the pub) and hash it out.

connection_stream

The first thing I had to do was refactor the code that created the instance of tornado.iostream.IOStream or tornado.iostream.SSLIOStream so that it didn’t actually do the SSL wrapping.

def connection_stream(connection):
    """
    Detect which socket the connection …

As part of my effort to make Blackhole as useful and usable as possible, I needed to be able to support SSL/TLS enabled connections.

Tornado itself has two built-in IOStreams that help us do the job: the first is the standard IOStream and the second is the SSLIOStream.

With this in mind, we simply need to spawn two sockets; by default these listen on port 25 for standard SMTP and port 465 for SSL/TLS-encrypted SMTP. With these two sockets bound, we can listen for incoming connections on either socket and use socket.socket.getsockname() to figure out whether the connection is to the encrypted or unencrypted socket.
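Stripped of Tornado, the dispatch boils down to checking the local port of the accepted connection. A standalone sketch (the port numbers match those above; the function name is mine):

```python
SSL_PORT = 465

def is_ssl_connection(connection):
    """Return True when *connection* was accepted on the SSL/TLS socket.

    getsockname() reports the local address of the socket, i.e. the
    port the client connected to, so comparing it against the SSL port
    tells us whether the stream needs to be wrapped in SSL.
    """
    return connection.getsockname()[1] == SSL_PORT
```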

Code

def connection_stream(connection):
    """
    Detect which socket the connection is being made on,
    create an iostream for the connection, wrapping it
    in SSL if connected over the SSL socket.

    The parameter 'connection' is an instance …