I have just started reading a book by Tim Carney entitled “The Big Ripoff” and he makes many of the points I have made time and again – the effects of regulation are generally not the ones you would expect.
Fans of regulation are generally taken to be opponents of “Big Business”, and they call for more regulation as a counter to the (alleged) pernicious effects of having large companies. Most of these effects are easy to see – strong pricing power, reduced service and many other things that are bad for the consumer.
The argument that Carney puts forward, though, is one I have been making for some time – both on this blog and elsewhere. The argument is a simple one: the more regulation you have in a market, the more that regulation will favour the big suppliers – i.e. the larger incumbents – over the smaller incumbents, any potential new players and, crucially, the individual consumers. So the more you regulate, the worse the situation gets for everyone but “Big Business”.
The reasoning is simple to see if you think about it – compliance is expensive. I know this as a simple fact – my fee base confirms it. It costs money to comply with regulation, and the more complex the regulation is, the more expensive compliance gets. For example, complying with the (old) Basel I Accord was a simple exercise. While it took a fair amount of work to achieve compliance when it first came in in 1988/89, most banking systems could comply with it fairly quickly and, importantly, cheaply.
While Basel II is a much better (i.e. risk-sensitive) set of standards, there is (I think) no one out there who would claim it is simpler than Basel I. The amount of money that even a small bank following the Standardised / Basic methods of compliance had to spend was a fair leap from the amount it had to spend under Basel I, and the amount you have to spend to comply with the Advanced methodology (and therefore to get a substantial capital advantage) is many, many times more.
There is simply no way that a small bank or a new player can afford this without the belief that they will become a big bank (and therefore Big Business) as a result.
Even keeping the people needed to maintain and upgrade these systems is expensive, and it requires a high (and profitable) turnover. There is no other way.
All of this means one thing – more regulation makes it relatively more expensive for small businesses to operate than for large businesses, and it imposes high barriers to new entrants into a market. Regulation therefore tends to help big business, not hinder it.
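The arithmetic behind that claim is easy to sketch. The figures below are purely illustrative (the post gives no numbers); they simply show how a largely fixed compliance cost weighs far more heavily on a small firm than on a large one:

```python
# Purely illustrative: a largely fixed compliance cost as a share of revenue.
# The dollar figures are invented for the example, not taken from the post.
compliance_cost = 5_000_000  # hypothetical annual cost of a compliance programme

for name, revenue in [("small bank", 50_000_000), ("large bank", 5_000_000_000)]:
    print(f"{name}: compliance is {compliance_cost / revenue:.1%} of revenue")

# small bank: compliance is 10.0% of revenue
# large bank: compliance is 0.1% of revenue
```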
BTW – I should add that a post at catallaxy reminded me to write this one.
18 comments
29 August, 2009 at 08:04
financialart
I understand your view (and that of ‘The Big Ripoff’). However, a (maybe too simple) question: is there a solution?
I guess the net benefits of good, proper regulation outweigh the lower level of competition? Also, I guess the complexity of the financial industry requires that kind of complicated Basel II, Solvency, etc. regulation?
Or are the big banks keeping the complex regulation alive to protect themselves from new entrants?
(Don’t hesitate to correct me if I’m missing the point.)
30 August, 2009 at 16:28
Andrew
financialart,
No – you have understood my point well. I believe there is a solution – but it is not in imposing more rigid, centralised regulation.
This is the sort of model I would like to see attempted. It is a truly free banking scenario with banks and regulatory systems able to compete against each other. It has worked in the past and I believe it could work in the future.
The big banks would, I believe, be very upset with any move to free up the market as they have huge vested interests in the current system and it protects them well.
There would, at least at first, be some problems as people got used to the new system but I believe the added competition and long term stability it would bring would be worth it.
31 August, 2009 at 22:03
Alice
Andy,
Is Clive any good on time series stats? Is Clive for hire, Andy, or do I have to be a bank?
1 September, 2009 at 00:36
Andrew
Alice,
Clive, like me, is no longer working directly for a bank. I would think him excellent on time series stats. His address is on the authors page.
1 September, 2009 at 20:21
Alice
Ok thanks Andy – I might drop Clive a line – I've got some problems – the bloody data has all the usual time series problems, with non-stationary variables and a structural break in 1975…and positive serial correlation in the errors. I don't know whether to use logs, lags or differences, Andy, and I'm working in GretL. I can run that!!
I'm trying to estimate a simultaneous equation system…it's working – sort of – but I'm not convinced it's right.
2 September, 2009 at 00:19
Andrew
There were a few structural breaks in 1975.
If Clive can’t help let me know. There may be a few more possible options.
2 September, 2009 at 10:40
Alice
That would be good Andy – I'm doing something called SUR – simultaneous equations – and when I chuck the last equation in (top decile share) it goes rampantly mental!!!
2 September, 2009 at 23:05
clive
Hi Alice,
Sounds as if you’re doing econometric modelling.
I have recently built a business-critical forecasting system for the health industry, which has its own features of interest. However, I've never done serious econometric modelling and suspect you'd get better advice from people who actually work in that field.
Undeterred, here comes some free comment, worth perhaps no more than its cost…
Given the long time span you seem to be contemplating, z_i = log(x_i) would be a sensible first step that may stabilise the error variance and diminish your non-stationarity problems.
Given the positive serial correlation, taking the first difference of the series is a sensible idea, i.e. d_i = z_i - z_(i-1), and then analyse d_i.
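A minimal sketch of those two steps in Python (placeholder numbers, not Alice's data – GretL and Stata have direct equivalents):

```python
import numpy as np
import pandas as pd

# x stands in for a strictly positive annual series, e.g. a decile's share of
# taxable income; these values are placeholders, not the actual data.
x = pd.Series([0.28, 0.27, 0.30, 0.33, 0.31], index=range(1945, 1950))

z = np.log(x)          # log transform: can stabilise the error variance
d = z.diff().dropna()  # first difference d_i = z_i - z_(i-1): removes a trend in levels
                       # and often reduces positive serial correlation
print(d)
```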
To model structural breaks the general idea is to include an indicator variable as part of your model; thus an indicator equalling 0 for all times before 1975 and equalling 1 for all times after 1975 acts as a model artifact that absorbs and estimates the size of a step change that took effect in 1975. This is admittedly only a simple concept of “structural break”.
Given that you seem to be working with multiple series and equations that link or constrain them (econometric modelling), the modelling of a structural break might require a bit more design to capture which parts of the relationship are considered to have broken and in what manner.
If the break is bad you may not gain much compared with modelling pre- and post-1975 as separate exercises.
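A sketch of the step-indicator idea above, again in Python on made-up data (the regression form is an assumption for illustration only, not a recommended model):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Made-up annual data standing in for the differenced series 'd'.
years = np.arange(1946, 2006)
rng = np.random.default_rng(0)
df = pd.DataFrame({"d": rng.normal(size=years.size)}, index=years)

# Step indicator: 0 before 1975, 1 from 1975 onwards (one convention for the break year).
df["step_1975"] = (df.index >= 1975).astype(int)

# The step_1975 coefficient then estimates the size of the 1975 level shift.
X = sm.add_constant(df[["step_1975"]])
print(sm.OLS(df["d"], X).fit().summary())
```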
HTH
4 September, 2009 at 21:25
Alice
Clive – thanks so much – I've done that (taken first differences) but based on lots of readings I appear to be pulling up perverse results – signs the wrong way around. I was using GretL but managed to get myself a copy of Stata today and I'm still feeling my way around that….I've discovered in Stata that what appeared in GretL to be non-stationary data is actually stationary with a few lagged differences, using an augmented Dickey-Fuller test (with drift).
So it gets more interesting….and I did a whole lot of tests today…thought I saved it and damn well lost it…somewhere. So I'm still getting to grips with Stata and have a way to go (probably a long way).
I'm suspecting, seeing as I'm looking at shares of taxable income over the long haul in Oz (from about 1945), that what works for one decile (as the endogenous variable) is not necessarily going to work for all the others, as happened in an Engle-Granger cointegration test.
This is not easy, so I'm veering towards a simultaneous equation model like SUR, which may suffice. I'm damned if I'm quite sure yet how to select the appropriate number of lags. Is that what you call a VEC lag pre-estimation test, would you know?
The break is bad! I've split the database ready for that, and it behaves quite differently post-break compared to pre-break.
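For what it's worth, here is a rough Python/statsmodels sketch of those two checks – an ADF test with drift and automatic lag selection, and a lag-order pre-estimation step for a system – on placeholder data (Alice is doing this in GretL/Stata, so this is only an illustration of the idea, not her workflow):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.api import VAR

# Placeholder series; replace with the actual decile-share data.
rng = np.random.default_rng(1)
y = pd.Series(rng.normal(size=60).cumsum())

# ADF test with drift (a constant term); autolag picks the number of lagged
# differences by AIC, which is essentially a lag pre-estimation step.
stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="c", autolag="AIC")
print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}, lags used {usedlag}")

# For a system of series, VAR lag-order selection (AIC/BIC/HQIC) plays the same
# role before fitting a VECM or a SUR-type system.
data = pd.DataFrame({"y1": y, "y2": y.shift(1).bfill() + rng.normal(size=60)})
print(VAR(data).select_order(maxlags=8).summary())
```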
4 September, 2009 at 21:34
Alice
I've done that too, Clive – I have my structural break dummy variable with zeros to 1975 and then 1s.
One other question: 1951 was a year that spiked up in many time series plots. There was 19% inflation and a huge spike in the top decile share….the question is, what to do with these outliers?
Do I need another dummy or can I just put some other value in there?
5 September, 2009 at 16:26
Alice
Jeez – I'm starting to sound like some of the students in here….please send me info on Islamic banking products…!!
6 September, 2009 at 23:13
clive
Were 1951 a step-change then, yes, one could have one extra indicator to model that feature. So you might have variables called step_1951 and step_1975 for example.
But your context sounds like a blip rather than a step. A step (“Heaviside function”) is a change in a parameter from one value to a different value which then persists. A blip (“Dirac delta function”) is a parameter with an isolated instance of a different value i.e. =1 in 1951 and zeroes everywhere else.
But this blip_1951 is just a fancy way of saying that the value in 1951 has a licence to be whatever it wants to be, which is effectively the same as you just putting in another value.
These indicator variables may sound a bit trivial in the above examples but they can do smart things in combination, such as allowing the slopes of regression relationships to jump from one value to another across a break while retaining other aspects of the model.
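A sketch of how the two kinds of indicator can sit in one regression, again in Python on made-up data (the model form – a trend whose level and slope change in 1975, plus a 1951 blip – is an assumption for illustration, not Alice's actual specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Made-up frame: 'share' stands in for the top-decile share, 't' is a time trend.
years = np.arange(1946, 2006)
rng = np.random.default_rng(2)
df = pd.DataFrame({"year": years, "t": years - years[0], "share": rng.normal(size=years.size)})

df["step_1975"] = (df["year"] >= 1975).astype(int)  # Heaviside-style level shift
df["blip_1951"] = (df["year"] == 1951).astype(int)  # Dirac-style one-off pulse

# step_1975 shifts the level, step_1975:t lets the trend slope change after the
# break, and blip_1951 simply absorbs the 1951 outlier.
fit = smf.ols("share ~ t + step_1975 + step_1975:t + blip_1951", data=df).fit()
print(fit.params)
```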
A whole other issue is outliers – what they mean, what to do with/about them, etc. Most of that is context-dependent, without easy answers. Actually, that's where this thread started – Lévy distributions etc.
7 September, 2009 at 12:21
Alice
Thanks Clive – it is a blip. I might give that a try….I don't like this Stata so far…its programming-language learning curve is appearing rather hideously monstrous to me right now…I'm going with another dummy for that blip and I'm sticking with GretL because at least I can work out how to run it easily……the Dirac delta function sounds good…I'll let you know what happens!
26 September, 2009 at 14:27
un-excogitate.org » Blog Archiv » Regulation Hinders the Small Guys
[…] read this article by Andrew over at ozrisk the other week and I’ve been meaning to comment on it but have only gotten around to doing it […]
24 October, 2009 at 18:57
ABOM
A deregulationist like Andrew supporting a genuine free market in money?
http://www.lewrockwell.com/north/north774.html
HaHaHaHaHa!!!
I’m drinking again, contemplating how unjust Goldman’s profit result is – dumb well connected Establishment idiots get free $$$s from the Treasury to counterfeit their profits. Why can’t I be allowed to counterfeit too! I have a colour printer, a great image of a couple of US Presidents, and I want to be the first to print a trillion dollar bill to pay off all the unfunded liabilities and also buy a few politicians who can rave about my “trading smarts”. And typing abilities.
24 October, 2009 at 19:16
ABOM
A small inspiration NOT to suck up to Goldman Sacks.
http://mises.org/story/3717
I may die poor. But the Makian Distribution will haunt the Black-Scholes people until the end of time.
24 October, 2009 at 19:17
ABOM
Sacks of ill-gotten gold, that is.