
Online music renaissance in Australia

A year or so ago, I complained about the dearth of music streaming services available. A couple of months ago, Spotify launched their service in Australia. Now, five long years after they started blocking Australian users from their service, Pandora has finally re-emerged in Australia. In his email to Australians who signed up for the service all that time ago, Pandora founder Tim Westergren wrote:

You can’t imagine how delighted we are to be able to bring Pandora back to you. We have been busy building the service in the U.S., but never gave up hope that we would someday return.

I, for one, am very happy to have Pandora back.

The power and peril of FRED

“FRED” is the St. Louis Federal Reserve Economic Database. It is an excellent repository of economic data, currently boasting 45,000 time series from 42 data sources. The website offers a powerful interface for creating charts of FRED data. Unfortunately, it is a little too powerful, offering a rather dangerous feature: the secondary axis.

I have railed against secondary axes before. They tend to lure the viewer into seeing spurious correlations. Experimenting with FRED, Business Insider has fallen into exactly that trap. In an article entitled “PRESENTING: the ultimate oil currency”, Joe Weisenthal concludes that the euro is surprisingly highly correlated with the price of oil, particularly when oil prices are denominated in gold (OIL.XAU). His evidence is a chart created in FRED (courtesy of the site’s data transformation feature, which allows you to divide the oil price in US dollars by the price of gold in US dollars).

FRED: oil and euro

Weisenthal goes on to produce similar charts for the Australian dollar (AUD) and the Canadian dollar (CAD), concluding that they do not track the oil price nearly so well. With superimposed time series like this, the eye is all too easily fooled into seeing correlations which do not exist. Simply separating the lines goes a long way to dispelling this illusion, as the charts below illustrate.

Small multiple oil plot

Looking at these charts, the strongest conclusion you would draw is that the euro and the oil price both went up in 2008, with the caveat that the euro started its run somewhat earlier, and then fell again towards the end of the year. At least you would probably agree with Weisenthal that the Australian and Canadian dollars do not track the price of oil.

Rather than using two axes when comparing financial price histories, it is better to scale both series to a common value (say 100) at an initial point and plot the results against a single axis. Doing this for the euro and the price of oil shows that the rise in oil prices in mid 2008 was far sharper than that of the euro, as was the fall towards the end of the year.

Index oil and euro
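If you would like to try this at home, rebasing is straightforward in R. The sketch below is indicative only: it assumes you have already downloaded the two FRED series into a data frame called prices with columns eur and oil_xau (hypothetical names), and scales each series to 100 at the first observation so they can share a single axis.

```r
# Rebase a price series to 100 at the first observation
rebase <- function(x) 100 * x / x[1]

# Both rebased series can now share one axis (hypothetical column names)
indexed <- cbind(EUR = rebase(prices$eur), OIL.XAU = rebase(prices$oil_xau))
matplot(indexed, type = "l", lty = 1, ylab = "Index (start = 100)")
legend("topleft", legend = colnames(indexed), lty = 1, col = 1:2)
```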

If that chart is not enough to convince you that Weisenthal’s euro/oil correlation is overblown, perhaps some statistics will help. The absolute price level of the time series is not important. What we need to measure is the correlation of returns (i.e. the percentage change in the prices)*. Daily returns might be a bit noisy, masking any correlations lurking in the data, so I have also calculated correlations for returns over a week (5 trading days) and a month (roughly 20 trading days).

        1-day returns    5-day returns    20-day returns
AUD     35%              35%              47%
CAD     -35%             -34%             -46%
EUR     20%              15%              27%

Correlation of Returns to OIL.XAU
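These correlations are simple to compute. The following R sketch assumes the same hypothetical prices data frame as above, now with daily price columns aud, cad, eur and oil_xau, and calculates the correlation of percentage returns at each horizon.

```r
# k-day percentage returns from a vector of daily prices
horizon_return <- function(x, k) diff(x, lag = k) / head(x, -k)

# Correlation between one currency's returns and OIL.XAU returns
return_cor <- function(ccy, k) {
  cor(horizon_return(prices[[ccy]], k),
      horizon_return(prices$oil_xau, k))
}

# 1, 5 and 20 day horizons; one column of correlations per currency
sapply(c("aud", "cad", "eur"), function(ccy)
  sapply(c(1, 5, 20), return_cor, ccy = ccy))
```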

The correlation between the euro and the oil price is unimpressive, only reaching 27% for monthly returns. Perhaps surprisingly, it is the Australian dollar that shows the highest correlation to oil. Then again, that is probably only surprising after looking at Weisenthal’s chart. After all, the Australian dollar is known as a “commodity currency”. But even for the Australian dollar, a 47% monthly return correlation is not very high.

Once again, the lesson here is to beware of secondary axes. If I were running the FRED site, I would ban the feature immediately.

* The problems with computing correlations between serially correlated time series, such as price data, are well known. See for example Granger and Newbold, “Spurious Regressions in Econometrics” (1974).

Shadow Banking

The Financial Services Authority (FSA) is the banking and financial services regulator in the UK. For now at least.

Back in 2010, the Chancellor of the Exchequer (the equivalent of the Treasurer in Australian terms) announced plans to scrap the FSA in response to the failure during the financial crisis of the 10-year-old “tri-partite system”. This tri-partite system split responsibility for national financial stability management between the Treasury, the Bank of England and the FSA. The government is now working on shifting responsibility back from the FSA to the Bank of England, a process which will establish three new regulatory bodies: the Financial Policy Committee (FPC), the Prudential Regulation Authority (PRA) and the Financial Conduct Authority (FCA). More three-letter initialisms and, dare I say it, a new tri-partite system?

Until this process is complete, the FSA continues about its business. The chairman of the FSA is Lord Adair Turner, Baron of Ecchinswell. Turner is also a member of the steering committee of the G20 Financial Stability Board (FSB). In March this year, he spoke at the Cass Business School in London on the topic of “shadow banking” and its role in the financial crisis.

Shadow banking, a term coined by Paul McCulley in the early days of the crisis, refers to a diverse range of entities such as “structured investment vehicles” (SIVs), hedge funds and money-market funds which have evolved to provide some very similar functions to banks, while not being subject to the same regulatory controls. A nightmare scenario for any bank is a “run”, when too many people try to withdraw their deposits at the same time. Shadow banks can also fall victim to runs. These runs may not be very obvious outside the financial markets (there are no queues of angry depositors on the streets), but they can be just as dangerous, and runs on shadow banks were in fact a major factor underlying the global financial crisis. For this reason, regulators like Turner and the FSB are not only focused on strengthening controls on banks, but on better understanding shadow banks and, if possible, subjecting them to regulation to reduce the chances of future financial crises.

So what is it that shadow banks do? To answer that, I’ll first go back to the basics of banking. Although banks have evolved to provide many other products and services, the essence of banking is taking deposits and providing loans. The diagram below illustrates the flow of capital from an investor to a bank and from a bank to a borrower. Having given the bank some money, the investor now has a financial asset in the form of a deposit (and the deposit is a liability from the bank’s point of view). Likewise, the loan now represents a financial asset for the bank (and a liability from the borrower’s point of view). So the bank acts as an intermediary between savers and borrowers. In doing so, however, a bank acts as more than a simple broker matching borrowers and lenders. Most bank lending also involves maturity transformation. More colloquially, this is known as lending long and borrowing short.

Bank Capital Flows

The typical depositor wants their money to be readily available in an at-call transaction account. Some may be tempted by higher interest rates to put money in term deposits, usually no longer than 6 months to maturity. On the other hand, most borrowers do not want their loans due and payable too quickly. Home buyers borrow in the expectation that their earnings over coming years will allow them to pay interest and principal on their loans. Likewise, companies making capital expenditure, building factories, buying equipment or acquiring other businesses borrow in the expectation that the revenue generated by their expanded business capability will allow them to repay their loans. In both cases, the term of the loans must match the timeframes over which earnings are generated.

Some lenders will be prepared to make longer term investments, some borrowers may be able to repay more quickly, but overall there is a mismatch in the maturity preferences of lenders and borrowers. Banks are in the business of bridging this gap in preferences. In the ordinary course of events, they can allow depositors to withdraw funds before loans are due to be repaid, making use of funds from other depositors, borrowing from other banks or, if need be, borrowing from the central bank. But if too many depositors withdraw at the same time and the bank is unable to meet those demands, then the bank can fail. This is known as liquidity risk, and it has become an enormous focus of regulators, risk managers and rating agencies around the world in the wake of the global financial crisis.

While the financial crisis certainly highlighted the dangers of liquidity risk for commercial and investment banks such as Northern Rock and Lehman Brothers, it was outside the traditional banking sector that the greatest liquidity problems arose, particularly as a result of securitisation.

Securitisation is a form of structured finance that predates the financial crisis by many years. Essentially it involves setting up a trust (or similar legal entity) which provides loans that become the assets of the trust (often referred to as a “pool” of loans). The funds to provide these loans are obtained by selling a special kind of bond, known as asset-backed securities (ABS), to investors. Principal and interest flowing from the loan pool is collected by the trust and periodically passed through to investors.

ABS capital flows

The most common form of securitisation bundles up pools of home loans, in which case the securities are referred to as residential mortgage-backed securities (RMBS).

Unlike bank lending, there is essentially no maturity transformation involved in financing by means of ABS. Investors cannot withdraw their money early from the trust; they have to wait until it is repaid by borrowers. The only other option for an investor wanting to “liquidate” their investment (i.e. turn it back into cash) is to find another investor to sell their securities to.

The problem with ABS is the overall mismatch of maturity preferences between borrowers and lenders. Without getting into the business of maturity transformation, there was always going to be a limit on how large the market for ABS could become. Faced with a problem like this, it was only a matter of time before innovative financiers came up with a solution. One such solution was asset-backed commercial paper (ABCP). This involves adding another step in the chain, often referred to as a “conduit”. The conduit was simply another legal entity which would buy ABS, funding the purchase by issuing short-dated securities known as asset-backed commercial paper.

ABCP capital flow

Just like a bank, the conduit is exposed to liquidity risk. Before the crisis, this risk was considered fairly low. After all, the assets of the conduit were readily tradeable securities. Most of the time the conduit could repay investors simply by issuing new ABCP to other investors but, in the unlikely event that no such investors could be found, it could simply sell the ABS. In some cases, investors were provided with additional assurance of repayment in the form of “liquidity backstops” provided by banks, essentially a guarantee that the bank would step in to repay investors if needed (although these commitments were not always very clearly disclosed to bank shareholders). This whole arrangement was considered highly satisfactory and conduits typically received the highest possible rating from credit rating agencies.

Unfortunately, liquidity risk is a real risk, as the world eventually discovered. Once the US mortgage market started to get into trouble in 2007, investors around the world began, quite reasonably, to be rather reluctant to invest in RMBS and other ABS. Prices on these securities began to fall. Managers of large-scale cash investment funds, until then enthusiastic buyers of ABCP, decided that more traditional cash investments were more attractive. The conduits were forced to sell ABS at precisely the time when prices were falling. Their selling pushed prices down further in a vicious cycle, a perfect illustration of the close relationship between funding liquidity risk (the risk of not being able to repay obligations) and market liquidity risk (the risk of being unable to sell financial assets at anything other than a painfully low price). As a result, some conduits were rescued by the banks backing them (“taking them back on balance sheet”), while others collapsed.

The problems of ABCP were just one example of non-bank liquidity failures during the financial crisis. Others include the venerable US money market fund, the Reserve Primary Fund, “breaking the buck”, and Australian non-bank lender RAMS finding itself unable to continue funding itself by means of “extendible commercial paper” (ECP).

ABCP conduits, money-market funds and non-bank mortgage lenders, along with the many other non-bank financiers that make up the shadow banking sector, had well and truly entered the business of maturity transformation and are all exposed to significant liquidity risk as a result. There are many linkages between banks and these shadow banks, whether through commitments such as liquidity backstops, direct lending or even partial or complete ownership. Regulators are concerned that too much risk in the shadow banking sector means too much risk for banks and too much risk for the financial system as a whole.

One strategy for regulators is to enforce a cordon sanitaire around banks, protecting them from shadow banks. But many, including Lord Turner, worry that this is not enough to protect our global financial system, with its complex interconnections, from damage when shadow banks fail. Ideally they would like to regulate shadow banks as well, preventing them from running too much liquidity risk. But this is not an easy task. As the name suggests, it is not easy to see what is going on in the world of shadow banks, even for well-informed financial regulators.

Spotify in Australia

A very short post today!

Finally one of the major music streaming services is available in Australia (I have complained about the limited options for music streaming down under here before). Spotify has now launched in Australia. I have only just signed up to investigate the service, so it is a bit early to give an opinion on how good it is, but I know it is very popular in Europe and the US.

I would like to see other services opening up in Australia, including Slacker, Rhapsody and Pandora, but given that all of these have been around for longer than Spotify, I will not be holding my breath.

Problem Pies

Last month the IMF published their latest Global Financial Stability Report. A colleague, who knows I rarely approve of pie charts*, drew my attention to the charts on page 27 of Chapter 3 of the report, which I have reproduced here.

Here the authors of the report have decided to attempt some graphical improvisation, taking the pie chart and extending it. Over time some inspired new chart designs have been developed, but these have been rare. More often the result is inferior to using an established technique. While I do not wish to discourage innovation, the results should always be tested before being foisted on an unsuspecting audience.

The aim of this pair of charts is to illustrate the dwindling supply of “safe assets” in the form of highly rated sovereign debt as a result of the global financial crisis. For example, at the end of 2007, 68% of advanced economies boasted a AAA Standard & Poor’s credit rating (left-hand chart, outer red arc) but by January 2012 this proportion had fallen to 52% (left-hand chart, inner red sector).

The heart of each chart is a conventional pie chart showing the current distribution of country ratings. Taken in isolation, either one of these would be a reasonable chart. But moving beyond a single pie chart, comparing the Advanced Economies chart to the Emerging Markets chart is not so easy. Edward Tufte’s adage from The Visual Display of Quantitative Information comes to mind: “the only worse design than a pie chart is several of them”. The crime against charting here is made particularly egregious with the choice of a colour scheme for ratings that is not consistent across the two charts!

If that wasn’t bad enough, the design comes right off the rails with the outer charts. These are a form of annular pie chart, but the alignment of each segment is shifted in an attempt to make the pre-crisis figure more readily comparable to the post-crisis figure for each rating. The result is highly confusing: it takes a while to work out exactly what is going on. Messing with the alignment of the outer chart also makes it harder to compare one rating to another. Even the decision to position the 2012 data in the middle and the 2007 data on the outside is a mistake. My eye expects a flow from the centre of the circle outwards rather than from outside in. An informal, if statistically insignificant, survey suggests that I am not the only one with this expectation.

The aim of any data visualisation is to provide easy access to the information. Understanding the IMF report’s chart is just too much work. A simple table of figures would have been easier to understand. But there are also more conventional charts that would do a better job. The chart below is an example of the “small multiples” technique. This involves a grid of similar charts which are readily compared as certain parameters are varied. In this case, scanning the charts horizontally reveals changes through time and vertically the differences between advanced economies and emerging markets.

Sovereign ratings from before the crisis (2007) to now (2012)

Some space could have been saved by restricting the vertical axis to a 0% to 70% range, but with the full 0% to 100% range the proportions for each rating are more readily grasped.
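For the record, a small-multiples chart like this is straightforward to produce with the ggplot2 package in R. The sketch below assumes the ratings data has been arranged in a long-format data frame called ratings with columns year, economy, rating and share (all hypothetical names); the scales package supplies the percentage axis labels.

```r
library(ggplot2)

# One panel per year/economy combination; bars show the share of
# countries at each rating, on a full 0-100% scale
ggplot(ratings, aes(x = rating, y = share)) +
  geom_col() +
  facet_grid(economy ~ year) +
  scale_y_continuous(labels = scales::percent, limits = c(0, 1)) +
  labs(x = "S&P sovereign credit rating", y = "Share of countries")
```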

The small multiples chart is a vast improvement on the IMF original, and is a good illustration of the fact that choosing the right chart makes it far easier to visualise the patterns in your data.

* One of the few pie charts I do approve of is this one (I have seen this one in many places, but I am not sure of the original source).

Goodhart’s Law

Another post and another Law, but this time no mathematics is involved.

Imagine you are running a team of salespeople and, as a highly motivated manager, you are working on strategies to improve the performance of your team. After a close study of your team’s client call reports you realise that the high performers in the team consistently meet with their clients more frequently than the poor performers. Eureka! You now have a plan: you set targets for the number of times your team should meet with clients each week. Bonuses will depend upon performance against these targets. Confident that your new client call metric is highly correlated with sales performance, is objective and easily measurable, you sit back and wait.

Six months later, it is time to review the results. Initially you are pleased to discover that a number of your poor performers have achieved very good scores relative to your new targets. Most of the high performers have done well also, although you are a little disappointed that your best salesperson came nowhere near the “stretch target” you set. You then begin to review the sales results and find them very puzzling: despite the high number of client meetings, the results for most of your poor performers are worse than ever. Not only that, your top salesperson has had a record quarter. After you have worked out whether you can wriggle out of the commitment you made to link bonuses to your new metric, you would do well to reflect on the fact that you have fallen victim to Goodhart’s Law.

According to Goodhart’s Law, the very act of targeting a proxy (client meetings) to drive a desired outcome (sales performance) undermines the relationship between the proxy and the outcome. In the client meeting example, the relationship clearly broke down because your team immediately realised it was straightforward to “game” the metric, recording many meetings without actually doing a better job of selling. Your highest performer was probably too busy doing a good job to waste their clients’ time with unnecessary meetings.

The Law was first described in 1975 by Charles Goodhart in a paper delivered to the Reserve Bank of Australia. It had been observed that there was a close relationship between money supply and interest rates and, on this basis, the Bank of England began to target money supply levels by setting short-term interest rates. Almost immediately, the relationship between interest rates and money supply broke down. While the reason for the breakdown was loosening of controls on bank lending rather than salespeople gaming targets, the label “Goodhart’s Law” caught on.

Along with its close relatives Campbell’s Law and the Lucas Critique, Goodhart’s Law has been used to explain a broad range of phenomena, far removed from its origins in monetary policy. In 18th century Britain, a crude form of poll tax was levied based on the number of windows on every house. The idea was that the number of windows would be correlated with the number of people living in the house. It did not take long for householders to begin bricking up their windows. A more apocryphal example is the tale of the Soviet-era nail factory. Once central planners set targets for the weight of nail output, artful factory managers met their target by making just one nail, but an enormous and very heavy nail.

Much like the Law of Unintended Consequences, of which it is a special case, Goodhart’s Law is one of those phenomena that, once you learn about it, you cannot help seeing it at work everywhere.

Benford’s Law

Here is a quick quiz. If you visit the Wikipedia page List of countries by GDP, you will find three lists ranking the countries of the world in terms of their Gross Domestic Product (GDP), each list corresponding to a different source of the data. If you pick the list according to the CIA (let’s face it, the CIA just sounds more exciting than the IMF or the World Bank), you should have a list of figures (denominated in US dollars) for 216 countries. Ignore the fact that the European Union is in the list along with the individual countries, and think about the first digit of each of the GDP values. What proportion of the data points start with 1? How about 2? Or 3 through to 9?

If you think they would all be about the same, you have not come across Benford’s Law. In fact, far more of the national GDP figures start with 1 than any other digit and fewer start with 9 than any other digit. The columns in the chart below show the distribution of the leading digits (I will explain the dots and bars in a moment).

Distribution of leading digits of GDP for 216 countries (in US$)

This phenomenon is not unique to GDP. Indeed a 1938 paper described a similar pattern of leading digit frequencies across a baffling array of measurements, including areas of rivers, street addresses of “American men of Science” and numbers appearing in front-page newspaper stories. The paper was titled “The Law of Anomalous Numbers” and was written by Frank Benford, who thereby gave his name to the phenomenon.

Benford’s Law of Anomalous Numbers states that for many datasets, the proportion of data points with leading digit n will be approximated by

log10(n+1) – log10(n).

So, around 30.1% of the data should start with a 1, while only around 4.6% should start with a 9. The horizontal lines in the chart above show these theoretical proportions. It would appear that the GDP data features more leading 2s and fewer leading 3s than Benford’s Law would predict, but it is a relatively small sample of data, so some variation from the theoretical distribution should be expected.
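These theoretical proportions are easy to compute for yourself. In R, for example:

```r
# Expected proportion of each leading digit 1..9 under Benford's Law:
# log10(n+1) - log10(n), which simplifies to log10(1 + 1/n)
benford <- log10(1 + 1 / (1:9))
round(100 * benford, 1)
# 30.1 17.6 12.5  9.7  7.9  6.7  5.8  5.1  4.6
```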

As a variation of the usual tests of Benford’s Law, I thought I would choose a rather modern data set to test it on: Twitter follower numbers. Fortunately, there is an R package perfectly suited to this task: twitteR. With twitteR installed, I looked at all of the twitter users who follow @stubbornmule and recorded how many users follow each of them. With only a relatively small follower base, this gave me a set of 342 data points which follows Benford’s Law remarkably well.
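I will not reproduce my full code here, but the heart of the exercise looks something like the sketch below. Treat it as indicative only: the twitteR package requires Twitter API credentials to be set up first, and its interface has changed over time.

```r
library(twitteR)
# Authenticate first, e.g. with setup_twitter_oauth(...)

# Fetch everyone following @stubbornmule and record their follower counts
followers <- getUser("stubbornmule")$getFollowers()
counts <- sapply(followers, function(u) u$followersCount)

# Leading digit of each non-zero count
leading_digit <- function(x) floor(x / 10^floor(log10(x)))
digits <- leading_digit(counts[counts > 0])

# Observed distribution of leading digits
table(factor(digits, levels = 1:9)) / length(digits)
```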


Distribution of leading digits of follower counts

As a measure of how well the data follows Benford’s Law, I have adopted the approach described by Rachel Fewster in her excellent paper A Simple Explanation of Benford’s Law. For the statistically minded, this involves defining a chi-squared statistic which measures “badness” of Benford fit. This statistic provides a “p value” which you can think of as the probability that Benford’s Law could produce a distribution that looks like your data set. The p value for the @stubbornmule follower counts is a very high 0.97, which shows a very good fit to the law. By way of contrast, if those 342 data points had a uniform distribution of leading digits, the p value would be less than 10⁻¹⁵, which would be a convincing violation of Benford’s Law.
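In code, this test boils down to an ordinary chi-squared goodness-of-fit calculation. A minimal version, assuming the digits vector from the earlier sketch, might look like this (Fewster’s paper and her accompanying R code should be treated as the authoritative version):

```r
# Chi-squared "badness of fit" to Benford's Law, and the associated p value.
# A large p value means the data is consistent with the law.
benford_p <- function(digits) {
  observed <- table(factor(digits, levels = 1:9))
  expected <- log10(1 + 1 / (1:9)) * length(digits)
  stat <- sum((observed - expected)^2 / expected)
  pchisq(stat, df = 8, lower.tail = FALSE)  # 9 categories => 8 degrees of freedom
}

benford_p(digits)
```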

Since so many data sets do follow Benford’s Law, this kind of statistical analysis has been used to detect fraud. If you were a budding Enron-style accountant set on falsifying your company’s accounts, you may not be aware of Benford’s Law. As a result, you may end up inventing too many figures starting with 9 and not enough starting with 1. Exactly this style of analysis is described in the 2004 paper The Effective Use of Benford’s Law to Assist in Detecting Fraud in Accounting Data by Durtschi, Hillison and Pacini.

By this point, you are probably asking one question: why does it work? It is an excellent question, and a surprisingly difficult and somewhat controversial one. At current count, an online bibliography of papers on Benford’s Law lists 657 papers on the subject. For me, the best explanation is Fewster’s “simple explanation”, which is based on her “Law of the Stripey Hat”. However simple it may be, it warrants a blog post of its own, so I will be keeping you in suspense a little longer. In the process, I will also explain some circumstances in which you should not expect Benford’s Law to hold (as an example, think about phone numbers in a telephone book).

In the meantime, having gone to the trouble of adapting Fewster’s R Code to produce charts testing how closely twitter follower counts fit Benford’s Law, I feel I should share a few more examples. My personal twitter account, @seancarmody, has more followers than @stubbornmule and the pattern of leading digits in my followers’ follower counts also provides a good illustration of Benford’s Law.

One of my twitter friends, @stilgherrian, has even more followers than I do and so provides an even larger data set.

Even though the bars seem to follow the Benford pattern quite well here, the p value is a rather low 5.5%. This reflects the fact that the larger the sample, the closer the fit should be to the theoretical frequencies if the data set really follows Benford’s Law. This result appears to be largely due to more leading 1s than expected and fewer leading 2s. To get a better idea of what is happening to the follower counts of stilgherrian’s followers, below is a density* histogram of the follower counts on a log10 scale.

There are a few things we can glean from this chart. First, the spike at zero represents accounts with only a single follower, accounting for around 1% of stilgherrian’s followers (since we are working on a log scale, the followers with no followers of their own do not appear on the chart at all). Most of the data is in the range 2 (accounts with 100 followers) to 3 (accounts with 1000 followers). Between 3 and 4 (10,000 followers), the distribution falls off rapidly. This suggests that the deviation from Benford’s Law is due to a fair number of users with a follower count in the 1000-1999 range (I am one of those myself), but a shortage in the 2000-2999 range. Beyond that, the number of data points becomes too small to have much of an effect.

Histogram of follower counts of @stilgherrian’s followers
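A density histogram like this takes only a line or two of R, again assuming the counts vector from the earlier sketch:

```r
# Density histogram of follower counts on a log10 scale; accounts with
# zero followers are dropped since log10(0) is undefined
hist(log10(counts[counts > 0]), breaks = 50, freq = FALSE,
     xlab = "log10(follower count)", ylab = "Density", main = "")
```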

Of course, the point of this analysis is not to suggest that there is anything particularly meaningful about the follower counts of twitter users, but to highlight the fact that even the most peculiar of data sets found “in nature” is likely to yield to the power of Benford’s Law.

* A density histogram scales the vertical axis so that the total area of the histogram is one, rather than showing the frequency of occurrences in each bin.

Pressure Drop

On Saturday night I found myself in Melbourne at the first live performance in 30 years of the reggae band Pressure Drop.

The last time Pressure Drop played I probably couldn’t have told you what reggae was. Although I would certainly have heard Eddy Grant’s Electric Avenue on the radio, I was a New Romantic at heart and was more inclined to listen to the likes of Adam and the Ants. At this point, in the interests of full disclosure, I should admit that Pressure Drop’s concert was the second of the weekend for me: I also saw Adam Ant at the Enmore Theatre the night before.

Since my New Romantic period, I have learned to appreciate reggae and dub, but that doesn’t explain why I came to be in the audience at the Caravan Music Club. The real reason was that Pressure Drop’s guitarist is Bill Mitchell, macroeconomist, Modern Monetary Theory (MMT) pioneer and the man behind Billy Blog.

Bill and I first came into contact in 2008 when we both blogged about alternative Olympic medal tallies (my post is here and Bill’s is here). I then became very interested in Bill’s expositions on the importance a country’s monetary system has for understanding the possibilities for fiscal policy. Bill’s ideas were the inspiration for many of my posts, such as Blame Greece’s Debt Crisis on the Euro and Park the Debt Truck, and I eventually came to know Bill in person after attending a couple of CofFEE conferences.

Bill’s academic interests extend from the workings of money to concerns about full employment and equity (my only podcast to date featured Bill explaining his idea of a job guarantee, which is underpinned by an understanding of MMT). Since reggae has its roots in concerns about social justice, it came as no real surprise to learn that Bill had once played reggae. Then last year the band started rehearsing again and these rehearsals culminated in the release of a new CD, aLIVE 2011. I was keen to hear the results and I suspect I was one of the first to order a copy.

Pressure Drop

Although Bill is based in Newcastle these days, Pressure Drop had always been a Melbourne band, so it was only natural that their return performance should be in Melbourne. I had been planning a trip to Melbourne for some time and fortuitously it coincided with the night of the concert. So it was that I found myself out at Oakleigh RSL on Saturday night.

Bill Mitchell on guitar with Pressure Drop

Before Pressure Drop and after I beat my brother at a game of pool, there was a two-piece support act: Ross Hannaford (of Daddy Cool fame) and Bart Willoughby, the drummer from No Fixed Address. I would hate to imagine how many years Ross has been playing guitar, but he is certainly able to coax beautiful sounds from those strings, apparently with no effort at all.

Once Pressure Drop came to the stage, it wasn’t long before dancing began. Bill had plenty of reverb on his guitar, but not so much as to hide the fact that he can play as well as he can blog (more than can be said for me). The band had a house full of enthusiastic supporters and they did not disappoint. The concert was great fun for all of us on the dance floor and it looked as though the band enjoyed themselves just as much. Don’t be surprised if Bill trades his professorial chair for a Fender on a permanent basis.

With the iPhone in hand, I managed to get a few photographs (the band has posted more here) and even a couple of brief video clips which I have crudely spliced.

I’ll be keeping an eye out now to see whether Pressure Drop decide to take the band on the road and play in Sydney. I can’t be sure when I’ll be in Melbourne next.

UPDATE: here is a video of the entire concert (1:40).

Bitcoin revisited

Just over a year ago, I wrote about the digital “crypto-currency” Bitcoin. It has been an eventful year for Bitcoin.

Bitcoin was designed to provide a secure yet anonymous, decentralised means of making payments online, and the first Bitcoins were virtually minted in 2009. By early 2011, Bitcoin had begun to attract attention. Various sites, including the not-for-profit champion of rights online, the Electronic Frontier Foundation (EFF), began accepting Bitcoins as payment. But when Gawker reported that Bitcoins could be used to buy drugs on “underground” website Silk Road, interest in the currency exploded and within a few days, the price of Bitcoins soared to almost $30.

This kind of attention was unwelcome for some, and shortly afterwards EFF announced that they would no longer be accepting Bitcoins, fearing that this would be construed as an endorsement of the now controversial currency. Around the same time, the first major theft of Bitcoins was reported and the Bitcoin exchange rate fell sharply.


Bitcoin Exchange Rate

More recently, another high-profile theft has caused ructions in the Bitcoin economy, prompting e-payments provider and PayPal competitor, Paxum, to abandon the Bitcoin experiment, which in turn forced one of the larger Bitcoin “exchanges” to shut down. The anonymity of Bitcoin is a design feature, but it also makes it almost impossible to trace thieves once they have their virtual hands on Bitcoins.

How much damage this does to the fledgling currency remains to be seen, but it certainly makes for a volatile currency. The free-floating Australian dollar is a reasonably volatile real-world currency but, as is evident in the chart below, Bitcoin volatility is an order of magnitude higher. That in itself is reason enough for any online business to think twice about accepting Bitcoins.

Bitcoin volatility: rolling 30 day volatility (annualised)
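For the curious, the volatility series in this chart can be calculated along the following lines. This is a sketch assuming a numeric vector price of daily closing prices (a hypothetical name), and uses the zoo package for the rolling window. The annualisation factor is a judgement call: Bitcoin trades every day of the year, so sqrt(365) is used here, whereas sqrt(252) would be more conventional for a currency like the Australian dollar.

```r
library(zoo)

# Daily log returns from daily closing prices
returns <- diff(log(price))

# Rolling 30 day standard deviation of returns, annualised
vol_30d <- rollapply(returns, width = 30, FUN = sd) * sqrt(365)
plot(vol_30d, type = "l", ylab = "Annualised volatility")
```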

Whatever its future, Bitcoin is a fascinating experiment and, even if it does not survive, digital currencies of one form or another are surely here to stay.

Data sources: Bitcoin charts, Bloomberg.

Bristol Pound

Recently, a colleague drew my attention to the “Bristol Pound”, an example of a “local currency”. Ah yes, I said, that’s been around for a few years now. Embarrassingly, I later realised I was thinking about the “Brixton Pound”. Having attended many concerts at the legendary Brixton Academy (Nick Cave, Ministry and the Sugarcubes among them), I really should have known the difference between Bristol and Brixton!

There are now a number of local currencies in Britain. The first to appear in recent years was the “Totnes Pound”, launched in March 2007. According to their website, the benefits of the Totnes Pound are:

  • To build resilience in the local economy by keeping money circulating in the community and building new relationships
  • To get people thinking and talking about how they spend their money
  • To encourage more local trade and thus reduce food and trade miles
  • To encourage tourists to use local businesses

The aims of the Brixton Pound, the Bristol Pound and the other local currencies are essentially the same. As far as I can tell, the take-up of these currencies to date has been modest, but the Bristol Pound represents an interesting new development. Not only does it have a far slicker website, but it also offers payment by mobile phone. Perhaps most significantly, according to the FAQ, “Business members that pay business rates to Bristol City Council will be able to pay in Bristol Pounds.”

A key tenet of “Modern Monetary Theory” is that the value of fiat money is not underpinned by gold or any other commodity; rather its value derives from the government levying tax in that currency. Since almost everyone has to pay tax at some point, this creates a base level of demand for the currency. So, perhaps the fact that the Bristol City Council is supporting the Bristol Pound will enhance its take-up prospects. It would be even more interesting if the council decided that they would only accept Bristol Pounds as payment for rates.