The Misbehavior of Markets

August 6th, 2010 by Potato

I just finished reading The Misbehavior of Markets by Benoit Mandelbrot.

It’s a concept book: it’s not going to tell you how to better manage your money or anything of the sort. There aren’t a lot of concrete examples, either. However, it discusses a number of important concepts, especially the important ways that our models of the markets differ from reality. Models are of course important for being able to manipulate variables, to try to forecast the future, or to simplify things to get a better understanding. However, relying too much on models that are not accurate can bite us in the ass. “Clouds are not spheres, mountains are not cones, coastlines are not circles, and bark is not smooth, nor does lightning travel in a straight line.”

First, a quick bit in the way of introduction for those who don’t have a strong stats background. The Gaussian, or normal, distribution is one that appears in a lot of places in nature (so much so that we call it the “normal” one!). Even if you’re not very familiar with it, you must have seen the familiar bell-curve shape:

Bell curve distribution, image from Wikipedia

It looks a little different for every group of things you choose to measure. For instance, the lengths of pencils in a fresh box may have a very “sharp” bell curve, since each one is very nearly identical in length, varying by just fractions of a millimetre. The heights of adults in the population will be a bit wider, with some people towering head and shoulders above others. The heights of kids in a school may have a wider one yet, with some being young, some passing through their growth spurt. But for all these disparate measurements, you can describe the way their bell curve looks with just two numbers: the mean, and the standard deviation (σ). The mean tells you where the centre of the curve is, and the standard deviation how wide it is.
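To make that “just two numbers” idea concrete, here’s a minimal Python sketch drawing samples from normal distributions for the three examples above. The means and standard deviations are made-up illustrative values, not measurements:

```python
import random

# Hypothetical (made-up) means and standard deviations for the examples above.
groups = {
    "pencil lengths (cm)":     (17.5, 0.05),   # very "sharp" bell curve
    "adult heights (cm)":      (170.0, 10.0),  # wider
    "school kid heights (cm)": (150.0, 20.0),  # wider still
}

for name, (mean, sigma) in groups.items():
    samples = [random.gauss(mean, sigma) for _ in range(100_000)]
    est_mean = sum(samples) / len(samples)
    est_sigma = (sum((x - est_mean) ** 2 for x in samples) / len(samples)) ** 0.5
    # The two numbers we put in are the two numbers we get back out.
    print(f"{name}: mean ~ {est_mean:.1f}, sigma ~ {est_sigma:.2f}")
```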

A lot of work has gone into investigating and characterizing these distributions. A very large part of the whole field of statistics is based on manipulating the properties of the normal distribution. There are countless tools out there to help you simulate or analyze data based on normal distributions. In addition to being common, the math is also fairly well-behaved, lending itself to analytical as well as numerical approaches.

So it may be no surprise that a great number of tools for financial matters are based on Gaussian distributions. For example, if you go to a retirement planner, and they have a fancy program that will show you the various possible outcomes for how much money you’ll have to retire on based on how much you save, saying things like “95% of the time, you’ll have at least $X available per year until you turn 100” — those software tools are based on simulating possible market outcomes with this math. Options traders have formulas to tell them how much their derivatives should be worth, built on the same distribution. Etc.
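As a rough illustration of what such a planner does under the hood, here’s a toy Monte Carlo sketch that assumes independent, normally distributed annual returns. The 7% return, 15% volatility, savings amount, and horizon are all assumptions for illustration, not anything from the book or a real planner:

```python
import random

def simulate_balances(years=30, annual_saving=10_000,
                      mean_return=0.07, stdev=0.15, trials=10_000):
    """Simulate ending balances assuming i.i.d. normal annual returns."""
    endings = []
    for _ in range(trials):
        balance = 0.0
        for _ in range(years):
            balance = (balance + annual_saving) * (1 + random.gauss(mean_return, stdev))
        endings.append(balance)
    endings.sort()
    return endings

endings = simulate_balances()
# "95% of the time you'll have at least $X" is just the 5th percentile of outcomes.
print(f"5th percentile ending balance: ${endings[len(endings) // 20]:,.0f}")
```

The whole “95% of the time” claim only means something if the Gaussian assumption baked into the loop actually matches how markets behave — which is exactly what Mandelbrot disputes.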

Yet, as Mandelbrot eloquently shows in the book, markets do not follow a Gaussian distribution. There are far more wild swings than should be seen in a normal distribution. These swings, such as the 1929 stock market crash that kicked off the Great Depression, the 1987 Black Monday crash, the Tech Wreck, the 2008/9 global financial crisis — they simply shouldn’t happen if the markets were following a random walk with the volatility implied by a normal distribution. The standard models say that these types of events should be so rare as to be essentially impossible, like finding a human closing in on the 10′ mark for height — basically, never in the course of human history should any of these events have happened, but there they are, and more that I haven’t listed, all within about one human lifespan. This volatility is, unfortunately, an inescapable feature of the markets.
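You can see just how badly the bell curve underestimates crashes with a quick back-of-the-envelope calculation. The roughly 1% typical daily volatility here is my own assumed figure; the Black Monday drop of about 22.6% in one day is the historical number. The point is just how absurdly small the Gaussian probability comes out:

```python
import math

def normal_tail_prob(z):
    """P(Z < -z) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

daily_sigma = 0.01   # assume ~1% typical daily volatility (illustrative)
crash = -0.226       # Black Monday, Oct 1987: roughly -22.6% in one day
z = abs(crash) / daily_sigma

print(f"That crash is about a {z:.0f}-sigma event.")
print(f"Gaussian probability of a single day that bad: {normal_tail_prob(z):.3g}")
# The number is so small you wouldn't expect it once in the age of the universe,
# let alone several crashes of various sizes within one human lifespan.
```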

Other core assumptions of the standard theories are also not true. For example, that price movements are independent. But we know that volatility clusters. This is even something seen in nature, away from the madness of human market psychology: if you plot out the size of a river’s spring flood each year for hundreds of years, it will trace a nice bell curve. But suppose you used that bell curve to size a dam, aiming for a reservoir big enough that, in 99 years out of the 100 you expect the dam to last, it could release enough water downstream to match the average spring flood. You’d find your dam would be too small. That’s because if the volatility clusters, it throws the calculations off. And that’s exactly what happens: one dry year follows another follows another, while the wet ones bunch up together too — more often than should happen if each year were truly an independent event, as with games of chance in a casino.
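Here’s a toy simulation of that dam problem — not Mandelbrot’s actual analysis, just a sketch of the clustering effect. The flood mean and standard deviation and the persistence parameter are all made-up numbers; both scenarios have the same long-run average and spread, but in one the wet and dry years cluster:

```python
import random

MEAN, SIGMA, YEARS, PHI = 100.0, 20.0, 100, 0.6  # assumed flood stats; PHI = year-to-year persistence

def annual_floods(persistent):
    """Flood sizes with the same mean and sigma, with or without clustering of wet/dry years."""
    flows, dev = [], 0.0
    noise_sigma = SIGMA * (1 - PHI**2) ** 0.5 if persistent else SIGMA
    for _ in range(YEARS):
        dev = (PHI * dev if persistent else 0.0) + random.gauss(0, noise_sigma)
        flows.append(MEAN + dev)
    return flows

def needed_capacity(flows):
    """Biggest cumulative shortfall while releasing the average flood downstream every year."""
    level, worst = 0.0, 0.0
    for f in flows:
        level = min(0.0, level + f - MEAN)   # surpluses spill over a full reservoir; deficits accumulate
        worst = min(worst, level)
    return -worst

def capacity_for_99pct(persistent, trials=5000):
    caps = sorted(needed_capacity(annual_floods(persistent)) for _ in range(trials))
    return caps[int(0.99 * trials)]          # reservoir size that survives 99% of simulated centuries

print("reservoir size assuming independent years: ", round(capacity_for_99pct(False)))
print("reservoir size with clustered wet/dry years:", round(capacity_for_99pct(True)))
```

Run it and the clustered case demands a noticeably bigger reservoir, even though each individual year looks statistically identical — size the dam on the independence assumption and it comes up short.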

Forgetting this assumption is one of the many factors that led to the subprime crisis in the States. They had fairly decent models telling them how likely it was that a borrower with bad credit would default on their mortgage. These financial warlocks could then figure out how to tranche the loans out in a big securitization package so that the top tranche would not experience any defaults (and then later what interest rate they’d need to offer to get people to take a chance on the lower tranches). However, what they didn’t account for is the clustering of volatility — when the loans went bad, they all went bad at the same time.
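A toy model shows why that matters so much for the top tranche — this is nothing like the actual pricing models, just a sketch. Both scenarios below have the same roughly 5% average default rate (the 5%, the 10% chance of a “bad year”, and the 15-default attachment point are all made-up numbers), but in one the defaults bunch up:

```python
import random

LOANS, ATTACH, TRIALS = 100, 15, 20_000   # senior tranche only takes losses beyond 15 defaults

def defaults_independent(p=0.05):
    """Each loan defaults independently with a 5% probability."""
    return sum(random.random() < p for _ in range(LOANS))

def defaults_clustered(p_bad_year=0.10, p_good=0.0222, p_bad=0.30):
    """Same ~5% average default rate (0.9*0.0222 + 0.1*0.30), but defaults bunch up in bad years."""
    p = p_bad if random.random() < p_bad_year else p_good
    return sum(random.random() < p for _ in range(LOANS))

def senior_hit_rate(draw):
    return sum(draw() > ATTACH for _ in range(TRIALS)) / TRIALS

print("senior tranche hit rate, independent defaults:", senior_hit_rate(defaults_independent))
print("senior tranche hit rate, clustered defaults:  ", senior_hit_rate(defaults_clustered))
```

With independent defaults the “safe” tranche almost never takes a loss; with clustering it gets hit in roughly one trial in ten, even though the average borrower is no worse.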

And the third important way that real markets differ from our models of markets is that prices are not continuous. Many real physical things are continuous. It’s not possible for me to teleport my keyboard from the top of my desk to the floor. It must — however briefly — occupy each point along the height axis as it falls. Much of our mathematics is based on continuous functions. Markets, however, are not real, physical things, and they are not continuous. The models, though, assume that they are. For most retail investors, this is one of the less devastating departures from accepted theory: so what if prices take discrete jumps? For the fancy traders trying to limit risk, though, non-continuity can break some of their techniques. One common one is the stop-loss order. You set a point at which you will not tolerate further drops in a stock’s price. If the stock is falling and hits your stop-loss price, it automatically gets sold by your broker. This can help you limit your downside.

But look at Manulife today. Let’s say a trader owned MFC and had a stop-loss set at $15.50, to limit their downside. This morning the results came out, and they were bad. The stock, which closed yesterday at $16, opened for trading today at $14.87. At no point was there ever an opportunity to sell it in the $15 range.

Manulife's discontinuity on Aug 5, 2010, adapted from Google Finance
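That’s the mechanic in a nutshell: a plain stop order (not a stop-limit) becomes a market order once triggered, and fills at whatever the next traded price actually is. A tiny sketch using the MFC numbers above — only the $16.00 close and $14.87 open come from the real chart, the later prices are made up:

```python
def stop_loss_fill(prices, stop):
    """Return the price a plain stop-loss actually fills at, given a sequence of traded prices."""
    for p in prices:
        if p <= stop:     # the stop triggers at the first trade at or below the stop price...
            return p      # ...and fills at that traded price, not at the stop itself
    return None           # never triggered

# Manulife, Aug 2010: closed at $16.00, then gapped down to open at $14.87 the next day.
prices = [16.00, 14.87, 14.60, 14.75]
print(stop_loss_fill(prices, stop=15.50))   # -> 14.87, not the 15.50 you were counting on
```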

So, to step back from the concepts themselves for a moment, I’ll say that the book was very approachable, even if you don’t have much of a statistical bent. Mandelbrot uses graphs well to get his point across, and tries to keep things non-technical (and there is next to no math). However, it is a concept book — it’s not going to tell you how to build a better portfolio or manage risk better.

Now, having just spent some time discussing the idea that markets are more volatile than we thought, and that they are not efficient, what do I have to say?

Well, despite the volatility, over the long run the stock market has still been basically the best place to keep your money. In the short run, the volatility can be very painful (as we’ve seen recently). But if you have many decades to wait, and believe that businesses will continue to be profitable, then you should put your money to work for you in the market. So, as someone still fairly young (despite my looks, as science has ravaged my body), I have a very high equity allocation. However, despite agreeing in theory with the concept of lifecycle investing and using leverage while you’re young, I’ve never been comfortable with the idea in practice, and have never been very leveraged (and not at all at the moment).

Efficient market theory says that market participants are rational, have (roughly) equal access to information, and that information about past price changes does not enable you to predict future price changes. (The esoteric proofs of some of this rely on the normal distribution.) This leads one to conclude that investing in index funds is the way to go, since beating the market is impossible. Now, I don’t believe in full efficient market theory. In particular, people are not rational, and not all information is widely and universally shared (or understood!). That’s why part of my portfolio is “actively” managed: I invest in particular companies I think have a good chance of beating the overall market. That said, I do believe in a weaker form of efficient markets, which is that the average person can’t beat the market net of fees. So although I talk about my stock tips and analysis here on the blog for discussion and feedback, the only recommendation I ever give people who ask is to invest in a low-cost index fund, such as TD’s e-series funds or an iShares ETF (and part of my investments are in e-series funds).


Solar Storm

August 4th, 2010 by Potato

A quick note for those who haven’t already heard: there’s a solar storm going on right now, which should make the aurora borealis (northern lights) visible from much further south than normal (i.e., southern Ontarians have a chance at seeing them!). The opportunity should last for another day or two, but unfortunately the terrestrial weather isn’t cooperating with the space weather: it was overcast and hazy here last night, and tonight might be more of the same… If you’ve got clear skies, and can get away from the light pollution, it may be worth taking a look up into the night sky.

Tater’s Takes

August 3rd, 2010 by Potato

Another bad week for exercising, but the diet was at least a bit better.

A NYT story on credit scores suggests that the pendulum has swung too far in the States, and now it’s becoming hard for even borrowers with decent credit to get a loan. Some mortgage brokers are lamenting that too much weight is being put on the FICO score:

In fact, FICO scores are not the best predictor. The amount of equity a person has in his home, his debt-to-income ratio, his job stability and his cash reserves are all better predictors than credit scores, according to Dave Zitting, the chief executive of Primary Residential Mortgage, a leading mortgage lender.

Now, from what I’ve read, I don’t know if I’d say debt-to-equity is a better predictor, but it’s certainly up there. This just reinforces my earlier point that the line “Canada doesn’t have a subprime mortgage problem” glosses over the prevalence of CMHC-supported low/no downpayment loans, which, while not quite as risky as a pack of NINJA negative-amortization loans, are still much riskier than the “conservative” banking culture played up in the media would suggest.

Aside from giving me another source to trot out for my ongoing argument, the article isn’t all that good. It belabours the point that lending criteria sometimes use hard cut-offs (like here with FICO scores) where the difference between just over and just under the line is too small to be meaningful. Unfortunately, life is full of such arbitrary cut-offs: for instance, if you have $499 in your account but write a cheque for $500, it bounces all the same as if you had nothing in the account. These cut-offs can help protect the larger system (e.g., the bank) from bad risks arising from decisions made throughout the organization. Though the article didn’t mention it, there is one solution to the issue: make the cut-offs continuous rather than binary. Rather than someone with a credit score of 620 getting a loan and someone with 619 getting nothing, scale it in: someone with a 650 could borrow right up to a debt service level of, say, 32%, while someone with a 620 could only get 20-some percent, with the limit scaling down to zero over a wider range… But that’s nitpicking the point.
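For the sake of argument, here’s what that continuous cut-off could look like as a sketch. The 620/650 scores and 32% debt-service ceiling come from the paragraph above; the floor score where the limit hits zero is my own made-up number, and none of this is anyone’s actual underwriting rule:

```python
def max_debt_service_ratio(fico, floor_score=560, full_score=650, ceiling=0.32):
    """Scale the allowed debt-service ratio linearly with credit score,
    instead of a single pass/fail cut-off at one magic number."""
    if fico >= full_score:
        return ceiling
    if fico <= floor_score:
        return 0.0
    return ceiling * (fico - floor_score) / (full_score - floor_score)

for score in (550, 619, 620, 650, 700):
    print(score, f"{max_debt_service_ratio(score):.0%}")
```

Notice that 619 and 620 now get nearly the same answer, so nothing dramatic hinges on one point of score.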

Rogers has tightened up their download limits again, just days after Netflix announced it was coming to Canada. That has raised whispers of market manipulation to shut out a competitor to their own video-on-demand service.

An older article from the CBC goes over some of the basics of download limits, and some of the anti-competitive issues. As you all well know, I think the companies are BSing us here — first off, their “average user’s usage” figure hasn’t moved in years, despite the rise of things like streaming video over that time. I think it’s probably way out of date now, likely a factor of 10 too low.

the company says the caps were necessary because between five to seven per cent of its customers were using more than 80 per cent of its bandwidth, thus slowing service down for everyone.

This is an argument trotted out often in favour of caps or limits, but what does it really mean? Years ago, during the first round of ISPs cracking down on heavy users, these sorts of arguments were used to cut “abusers” off… but this sort of relationship is just a feature of how humans distribute resources. It’s the Pareto principle. Plus, of course, data transfer is very cheap, on the order of cents per GB, yet the overage charges are $2/GB. Even with a healthy margin added in to act as a disincentive, this is clearly a massively profitable area for ISPs, way beyond the costs of data transfer or any economic disincentive. The real issue they often complain about with their networks is peak usage, i.e., congestion at certain times of day (especially with Rogers’ architecture). Yet they’ve taken no steps towards time-of-use billing, even though that would make more sense.
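Just to put a number on that markup (the per-GB cost here is my own rough assumption, “on the order of cents”):

```python
cost_per_gb = 0.03        # assumed wholesale transfer cost per GB (illustrative)
overage_per_gb = 2.00     # Rogers' overage charge per GB
overage_gb = 60           # say a household goes 60 GB over in a month

print(f"markup: roughly {overage_per_gb / cost_per_gb:.0f}x the underlying cost")
print(f"that month: ~${overage_gb * cost_per_gb:.2f} in cost vs ${overage_gb * overage_per_gb:.2f} billed")
```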

On the StarCraft 2 front, I ended up using one of my guest passes since it’s just getting ridiculous that they can’t solve the account problem that’s kept me from playing. I finished the single-player campaign, and found it quite short. There were 26 missions (Terran plus a few Protoss side missions), which compares well to the number of missions in the first StarCraft. However, I found the missions to be very fast and small-scale. I don’t think there was any one level that took me more than 30 minutes to clear, whereas I remember at least one level per race in the original taking over an hour as you had to carefully pick your way through the enemy forces and sometimes win by attrition. Detaching the single-player a bit more from the multiplayer did add some neat options with unit upgrades and mercenaries, as well as a greater spectrum of units (e.g., the medic, wraith, and goliath, which were cut from multiplayer).

Still haven’t played multiplayer, though. I tried calling the support line a few times last week, only to find that I couldn’t even get into the holding pattern since the queue was full. When I did finally get through, the fellow was nice but the problem didn’t get solved. I sent an email right away on release day, and found it ironic that the recorded message telling people to call back later because the hold queue was full suggested emailing support instead. Finally, 6 days later, a rep got back to me, and after some back-and-forth going through the motions of trying steps that everyone in the support forums said didn’t help (and that I had already tried on my own), it looks like I should get my account fixed tomorrow (8 days after release). Update: Just got in, woo-hoo! Now I’m too tired to play though and have to go to bed…

Iomega ScreenPlay Plus

August 1st, 2010 by Potato

These Rogers bills are killing us. Ok, financially, we can afford $35/mo, but it just seems so silly to spend that much (now with HST!) for basic, analog cable. We don’t see the value in the higher TV packages, and especially not in digital. Especially now that there are so many shows offered on demand online (e.g., Jon Stewart on the Comedy Network, all of BNN’s programming). I haven’t watched TV in like 2 years now, and don’t have any kind of cable at my place in London. Wayfare, unfortunately, stipulates that she must be able to put something on in the background while she veges in the living room, and hasn’t wanted to give up cable.

For a while I was using my Xbox with TVersity to stream shows from the computer, but that was a little cumbersome because it meant the computer had to be up-and-running with TVersity going, etc. Wayfare wants something simpler.

So I was encouraged to see the Iomega ScreenPlay Plus HD Multimedia Player come up as an option. It’s cheap: $150, which includes a 1 TB hard drive, and promises to be easy to use, plugging right into the TV and playing most of the common digital video formats.

First up, right out of the box it looks well put-together: there are a few pieces of foam to shockproof the drive, and the cables are in baggies, but other than that the box is split into 2 neat compartments without a lot of wasted packaging. Best of all, none of that ridiculous hard-shell packaging crap that drives me nuts. It has a remote control for operating the player part of the system, and yes, it comes with batteries. It also comes with a (rather short) USB cable, and composite and analog AV cables. Nice.

I can hook it up to my computer just like any other external USB drive, and load it up with files.

Then, it also has another USB port on the front so you can plug in another USB drive and just use the player function: something I already know will come in handy since Wayfare was a little hesitant about downloading shows and then having to unhook the drive from the TV to load them on it. She can just download them to a key or other drive, and play from there.

The operation with the remote and viewing content on the TV is straightforward. Unfortunately, the file list is displayed as “large tiles”, where the tiles are unhelpful icons just showing that you are looking at a video file. Lots of wasted screen real estate there. Only 7-8 characters of the filename show up under the large icons, with a fuller name at the bottom. Weirdly enough, even though the name at the bottom only takes up half the screen, it still truncates after 30 characters or so. I tested a few DIVX files and they all played fine — some looked great, but others had a fair bit of graininess/artifacts to them. Not sure why the difference, but I’m pretty sure those same files showed up fine on the Xbox/TVersity combo. I may have to investigate the video quality a bit more… One file that absolutely refused to play over TVersity played the video here, but not the audio, so it doesn’t look like there’s much more capability on that front. OGG and MKV files don’t even appear in the file list as options, just AVIs and WMVs.

With composite and HDMI outputs it does have high def capabilities, but I didn’t test those out.

So from a video player point of view, it does work out of the box, but it looks like it could use a few more refinements yet. Oh, one other weird bug is that it sorts capital and lower-case letters separately (i.e., it would order a list of shows like Alpha Delta Echo November Zulu alpha bravo delta… etc.), which is just not right.
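That’s the classic naive byte-order sort, where every capital letter sorts ahead of every lowercase one; the fix is just to compare names case-insensitively. A quick illustration in Python (the show names are placeholders):

```python
shows = ["Alpha", "Delta", "Echo", "November", "Zulu", "alpha", "bravo", "delta"]

print(sorted(shows))                  # byte-order sort: all capitals first, like the ScreenPlay does
print(sorted(shows, key=str.lower))   # case-insensitive sort: what you'd actually expect to see
```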