
August 22, 2015

Comments

gastro george

So execs aren't investing because they are thinking long term? Doesn't sound likely when they're incentivised by next year's share price.

Steven Clarke

This is confusing.

We have Tyler Cowen's Great Stagnation school telling me we are at a technological plateau - and living standards are stagnating because of a lack of monetisable inventions on the horizon.

And now you're saying because people can see a lot of monetisable inventions ahead, they're scared of taking the first step, because the second step might hurt them.

Also, I know you've mentioned this before, but between 1855 and 1938 people were happy to introduce new inventions, despite the falling profit rate. Can't people do so again? Do you really think they've 'wised up'? Has human nature really become much more forward looking? Seems out of step with your other views.

Matt Moore

Hm.

Growth is of two sorts.

We can increase labour and capital levels within a given technology. Capital growth is highly correlated with investment. Or we can improve the production technology itself, which has little to do with the vast amount of capital investment that goes on, especially as we move away from industrial innovation and towards software innovation. It's a different process entirely - investment in human capital is not captured by these statistics, even though that is the most important kind these days.

The majority of capitalist system growth has NOT come from capital increases - piling brick on brick - but from better ideas ("market tested betterment" is a preferred formulation to "capitalism").

If the rate of physical capital increase is falling because the rate of improvement of the production function is increasing (which is how I interpret your argument), then this is evidence for the optimistic camp.

PS if you haven't yet read McCloskey's latest works, here's a good summary: https://www.youtube.com/watch?v=x-rf66RRtx4

Igor Belanov

"There is, though, a bigger vision here - that robots can give us all a high income and free people from drudgery."

I think this is a bit of a red herring. In the Western world very few people live in 'drudgery' and the ones that do, like those in the Third World, could be lifted out of it if we committed to greater equality of wealth and resources.

The problem with capitalism's use and development of technology is with its pursuit of exchange value rather than use value. Thus resources are poured into consumer goods that offer an often intangible improvement on the previous model, and built-in obsolescence encourages waste in the interests of production. Where we should be encouraging technology is in the realm of production, where the development of technologically advanced, durable products in automated, high-tech environments could radically reduce working hours and curtail the creation of waste and pollution. The pursuit of capitalist growth effectively inhibits this.

Deviation from the Mean

I work in advanced business solutions and I think this article brings out a very real truth about how businesses view investment.

I also think Matt Moore could not be more wrong to make a distinction between brick-on-brick and 'new ideas' innovation. The distinction does not exist; they are essentially the same thing. What happens is that at some point the quantitative difference results in a qualitative leap.

chris

@ Steven - Schumpeter might be relevant here. He suggested that the buccaneering entrepreneur would be replaced by the bureaucratic rationalist manager. I suspect this might have happened, with the result that the overconfidence that has driven investment and innovation in the past might be lacking now:
http://stumblingandmumbling.typepad.com/stumbling_and_mumbling/2015/06/secular-stagnation-as-wising-up.html
One way out of stagnation could be that investors rediscover their overoptimism - but this might not happen if they remain scarred by memories of the tech crash and 2008 crisis.
@ Gastro George: it's possible that next year's share price will be depressed by fears of future creative destruction. I'm not sure that capitalism is as short-termist as some think:
http://www.investorschronicle.co.uk/2015/08/04/comment/chris-dillow/short-termist-0Y28RStYyDNtvn7c2dhbbO/article.html


Steven Clarke

@Deviation @Matt

I'm guessing Matt is talking about a very important distinction: between buying more spades, or inventing a mechanical digger; more horse-drawn carts, or an internal combustion engine. One way uses a greater quantity of an existing technology; the other improves the technology.

If you think of a Solow growth model:

Y = AF(K,L)

or some variant thereof.

You can increase output Y by having more inputs of capital K and labour L.

Or you increase the productivity by which you use those inputs, A, by having better technology, skills and organisation.

A captures a lot that's difficult to measure, but it alone leads to continued long-term growth.
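
To make that concrete, here is a toy sketch in Python using a Cobb-Douglas form for F (the functional form and all the numbers are my own illustrative assumptions, nothing from the post): doubling K runs into diminishing returns, while doubling A scales output one-for-one.

# Toy Solow-style illustration (illustrative numbers only):
# compare raising output by adding capital vs. raising productivity A.

def output(A, K, L, alpha=0.3):
    """Cobb-Douglas production: Y = A * K^alpha * L^(1-alpha)."""
    return A * K**alpha * L**(1 - alpha)

A, K, L = 1.0, 100.0, 100.0
base = output(A, K, L)

more_capital = output(A, 2 * K, L)     # double the capital stock
better_tech  = output(2 * A, K, L)     # double total factor productivity

print(f"baseline: {base:.1f}")
print(f"double K: {more_capital:.1f}  (+{more_capital/base - 1:.0%}, diminishing returns)")
print(f"double A: {better_tech:.1f}  (+{better_tech/base - 1:.0%}, scales output one-for-one)")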

Matt M

A lot of investment in capital these days, as Matt points out above, is 'off balance sheet' - in human capital, public investment, and in the captured positive externalities of investment made by other firms, often in entirely different industries. Looking at investment in machinery at company A and the like doesn't make sense anymore.

I'd also add that, like many economists' zombie ideas, this Ricardian equivalence/intertemporal optimisation idea is just embarrassing. Do managers really expect technical change to make technology cheaper, and so hold back until point x in the product life cycle? Maybe, but I doubt they can quantify it, compare it to alternative techs, or have the balls to wait out their competition, and to be honest I doubt they can even comprehend a world beyond the next quarter, as the commenter above says. If this were the case hardly any tech would get off the ground, because nobody would give it the demand scale to drive down fixed costs per unit. Generally speaking, the early adopters of mobile phones, fracking, cloud computing etc. got an edge even if the producers of these ideas made no money.

If people are going to throw around Ricardian equivalence, they should show empirics; the theory of it is illogical if you understand that 30% of people can't find their balance on a bank statement.

Dave Timoney

A quibble: the decline in the profit rate for the UK is magnified by two other secular trends, namely the growth of global competition (notably the US, Europe and Japan) over the period, and the growing capture of surplus value by wages and taxes coincident with the rise of organised labour and the welfare state. You need to discount these to isolate the impact of technology.

It's also worth asking whether the decline in the rate of profit has followed a similar trajectory since 1938. Many studies suggest that it did up until the 1980s, but that the neoliberal era has been marked by an increase in profit rates since then. You could attribute this to increased business confidence, but the lower levels of investment would suggest otherwise. Alternatively, you could attribute it to financialisation, noting in passing that the stabilisation of the profit rate trend between 1885 and 1914 reflected the first period of global financialisation.

On the binary distinction Matt Moore raises re physical vs intellectual capital, it's worth thinking about the nature of software as a hybrid form of capital. For example, it doesn't experience material wear and tear; it is arguably exempt from moral depreciation, because it has no exchange-value (due to licence restrictions) and can be progressively upgraded/refactored rather than requiring replacement; and it acts as a store of intellectual capital independent of labour.

Software is also driving down the unit cost of investment, which may be giving a misleading impression that low levels of aggregate investment are due to a lack of confidence. Part of the problem here is the paradigm of "corporate investment". The exploitation of freeware and the shift from capital expenditure to SaaS (software as a service) means that there is a huge amount of commercial investment that isn't captured in that definition.

Bob

Businesses invest when they are completely swamped with demand.

Deviation From The Mean

I think the kind of innovation or research that leads to a mechanical digger over using more spades is still a brick on brick invention.

But I think the kind of out-of-the-box, left-field research that plays around with existing ideas (still brick on brick) and, for example, tries to turn a number of tools and separate processes into one machine requires a separate research regime outside the individual firm(s) - for example, state universities, NASA or state-funded research centres. It should be noted that private concerns will make use of this external-to-the-firm regime. I have a relative who used to work in genetics in a state-funded research facility, and at one point most of their funding came from the British Racing Board. (They looked into the idea of injecting cow cells into horses because cows were much more robust in dealing with injury and disease.) The point is that the British Racing Board utilised a state-funded regime of research.

FATE - I work directly in developing software and if by moral depreciation you mean that your product becomes obsolete because a better product comes to market then I would say moral depreciation is a huge issue! And this article I think discusses that very real issue.

Dave Timoney

DFTM - Much of the software on which modern business depends is actually decades old: the core of Windows, Unix, the TCP/IP stack etc. In aggregate, software has far greater longevity than hardware. The classic example of this would be the banks, some of whose software is over 50 years old. Though this can give rise to problems (e.g. RBS), it highlights a trade-off between the cost and risk of replacement on the one hand and persistent utility on the other. This structural constraint (which is amplified by the commercial desire for "lock-in") can result in software becoming so embedded (and a business process so dependent) that moral depreciation is weakened.

A further driver of this tendency is that software has a far greater potential to evolve and grow than hardware. A better product coming on the market could mean a wholly new codebase, but it could equally mean the refactoring or extension of existing software. Whereas this has historically been packaged as a replacement, with a one-off fee, the trend now is continuous upgrades as part of a rental agreement. The news that Windows 10 will be the last major release of the OS does not mean that Microsoft's days are over, but that it is moving to a "post-depreciation" model.

Even at the application level, moral depreciation is not straightforward. Commercial package A may encode better business rules than package B, but the codebase may be largely common due to opensource libraries and standard methods. Also, bear in mind that a modern system's value may be largely a reflection of its associated data, and that those data are constantly being updated by users, so they appreciate rather than depreciate.

Modularisation and the separation of layers in software engineering reflects the isolation of business rules as intellectual capital. In other words, traditional moral depreciation increasingly applies only to a part of a software system. Where those rules are generic or opensource, there may be no moral depreciation to speak of.

Deviation From The Mean

"Much of the software on which modern business depends is actually decades old"

It really isn't. One issue that I will accept is that introducing new technology can take a long time, and by the time you have gone through proof of concept, designing, building, testing, retesting and implementation, a new and better product may already have hit the market in the intervening period. But you won't persuade many managers to ditch a costly product just as it is being implemented!

So in some cases businesses have out of date and patched up products because they won't invest in another product because the next version may be even better!

I will give an example: imagine a company has version 7 of a product, and even though this product works and meets its current needs, there are a number of issues they would like to see addressed. The feedback from the software provider is that they will try to address these concerns by version 9. In the meantime version 8 is coming to market and naturally is an improvement on version 7. But the company will decide to stick with version 7 rather than go to version 8, because they expect that version 9 will address their particular issues.
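
To put rough numbers on that trade-off (the figures and the discount rate below are purely illustrative assumptions of mine, and the Python is just a sketch of the comparison):

# Hypothetical upgrade decision: take version 8 now, or wait for version 9.
# All figures are made-up placeholders for illustration.

DISCOUNT = 0.10  # assumed annual discount rate

def npv(cashflows):
    """Net present value of a list of (year, amount) pairs."""
    return sum(amount / (1 + DISCOUNT) ** year for year, amount in cashflows)

# Option A: upgrade to v8 now - pay the change cost today, get a modest benefit each year.
upgrade_v8 = npv([(0, -100), (1, 50), (2, 50), (3, 50)])

# Option B: wait two years for v9, which fixes the issues the company actually cares about.
wait_v9 = npv([(2, -100), (3, 60), (4, 60), (5, 60)])

print(f"NPV of upgrading to v8 now: {upgrade_v8:.1f}")
print(f"NPV of waiting for v9:      {wait_v9:.1f}")
# With these assumptions waiting wins, even though v8 is objectively better than v7.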

We are living in the age where moral depreciation sets in even before the product goes live!

Stigand

Brilliant post. I think it becomes even clearer if you think of what's happening in terms of spillovers.

If an investment has high spillovers, there's a benefit to being a "fast follower" rather than a first mover. (Think of Google or HTC in the smartphone market: they benefited from Apple's investment in the iPhone, in that it created a market and a product type that they could then play into.)

I suspect that $1 of investment today generates more spillovers than $1 of investment in 1950 (since more investment is intangible and intangible investment seems to have more spillovers than tangible investment: it's easier to keep your competitors from using your new factory than to keep them from using your new business model).

Note that you don't need to believe anything about the speed of technological innovation for this to be true.

gastro george

DFTM and FATE. In my experience, life amongst software users is even more complex than you describe. For example, execs may wish to upgrade to a newer technology, but find that their business rules and procedures are tied up in their existing technology to a far greater extent than they first imagine. So replacing an existing software suite with a new suite, or even the latest version, can represent huge costs in remodelling of business processes, re-education of staff, etc. That's before you start talking about cycles of testing which major corporations will have in place before even minor upgrades/patching. It's not a coincidence that the NHS has a large cohort of PCs still running XP (and that's not trying to make a private/public point).

It's also true that there is a model of software development in the US that essentially abandons the concept of future development or even maintenance. It's a cash-in on immediate demand and move on to the next product model.

Dave Timoney

DFTM, I think we should clarify what Marx meant by "moral depreciation". To that end, consider the part of the quote hidden by Chris's ellipsis: "In both cases, be the machine ever so young and full of life, its value is no longer determined by the labor actually materialized in it, but by the labor-time requisite to reproduce either it or the better machine. It has, therefore, lost value more or less. The shorter the period taken to reproduce its total value, the less is the danger of moral depreciation; and the longer the working day, the shorter is that period."

The point is that the exchange-value of a machine (hardware) will decline even if it isn't in use and thus subject to wear and tear. This is because the cost of replacing the machine will fall due to technological progress, which either makes the production of similar machines cheaper or creates new types of machine that fulfil a similar function at a lower unit cost.

What Marx meant by "reproduce its total value" was the transfer of labour value via the machine to its products - i.e. a £100 machine produces 100 widgets at £1 a pop. The danger of moral depreciation evaporates once this breakeven point is reached, hence there is an incentive to exploit labour at a faster rate. However, many economists would argue that the capitalist already has plenty of incentives to exploit labour and that moral depreciation is a marginal worry at best.
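
To put illustrative numbers on that breakeven point (the 20% annual fall in reproduction cost and the output rate are my assumptions, not Marx's), a quick Python sketch:

# Illustrative arithmetic for moral depreciation (made-up figures).
# A £100 machine transfers its value to widgets at £1 a pop, while the cost
# of reproducing an equivalent machine falls by 20% a year.

machine_cost = 100.0        # original outlay
widgets_per_year = 25       # value transferred per year (£1 per widget)
decline = 0.20              # assumed annual fall in reproduction cost

for year in range(1, 5):
    recovered = min(widgets_per_year * year, machine_cost)
    remaining_fraction = 1 - recovered / machine_cost
    reproduction_cost = machine_cost * (1 - decline) ** year
    # Book value still to be recovered, versus what that remainder is worth
    # when re-priced at today's (lower) reproduction cost.
    book_value = machine_cost * remaining_fraction
    repriced_value = reproduction_cost * remaining_fraction
    moral_loss = book_value - repriced_value
    print(f"year {year}: recovered £{recovered:.0f}, moral depreciation £{moral_loss:.1f}")

# Once the full £100 has been transferred (year 4 here), the exposure drops to
# zero - hence the incentive to work the machine harder in the meantime.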

This brings us back to Matt M's comments on the "Ricardian equivalence/intertemporal optimisation zombie idea". Do we believe that capitalists are pessimistic about investment because they are optimistic about technological development? If there is an inverse correlation, this would imply that periods of high investment reflect low expectations, which doesn't seem plausible. I think a more plausible explanation is that the cost of investment is falling, and this is in large part due to the peculiarities of software.

Re your specific example of version-skipping, if moral depreciation were that strong a force, everyone would upgrade to version 8 as soon as possible. That they don't reflects the trade-off between the cost of change and expected utility. There are 3 factors at work here: 1) The upgradability of software means that some of the gains of new techniques can be applied without full replacement (i.e. you go from 7.0 to 7.6 rather than 8.0). 2) The falling cost of investment (i.e. brand new software) is offset by the growing cost of process change, as GG notes. 3) The lack of wear and tear means that the software's useful life is extended well beyond any depreciation schedule (hence all the PCs still running the 14-year-old XP).

ChapWithNoName

To what extent do (or should) patent law and IP protections redress this, and if they are insufficient, what could be done to help? From an industry-wide perspective innovation may not lead to increased profits due to competition, but from an individual company's perspective surely higher returns can be made in the short run from innovation, either by increasing margins or growing market share?

aragon

Chapwithnoname, no - IP and patents are obstructions to innovation; witness the patent thicket.

Chris was arguing from a consumer-of-innovation perspective rather than an innovator's.

Technological Deflation.

We have had technological deflation in the PC world for decades - Moore's law (1975 to the present) - where the next (two) years' product will be significantly better than the current product. It applies to chips, and variations apply to networking equipment.

https://en.wikipedia.org/wiki/Moore's_law

Prices are sticky: PC prices tend to stay stable, or decline slowly, while functionality improves, which suggests no half-price robots. Some robots may be made redundant by new, more functional robots, but many will continue to perform, with extra functionality expanding the envelope and perhaps replacing people.
e.g. Baxter

http://www.rethinkrobotics.com/baxter/

We may be at a point of inflection, but people cannot delay purchasing decisions for long. You might implement a rolling programme of investment to expand the scope of your robot workforce. But you have customers to service now.
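
A rough sketch of that wait-versus-buy arithmetic under Moore's-law-style improvement (every number below is an illustrative assumption, not data):

# Rough wait-vs-buy arithmetic with sticky prices and improving capability.

price = 25_000            # robot price, assumed roughly stable (sticky prices)
output_per_year = 10_000  # assumed value of work a current robot does per year
improvement = 2.0         # capability assumed to double every two years
horizon = 6               # evaluate over six years

# Buy now: pay today, get current capability for the whole horizon.
buy_now = -price + output_per_year * horizon

# Wait two years: pay the same sticky price for a robot twice as capable,
# but forgo two years of output while waiting.
wait = -price + (output_per_year * improvement) * (horizon - 2)

print(f"buy now : {buy_now:,.0f}")
print(f"wait 2y : {wait:,.0f}")
# On paper waiting can look better - but only if you can afford to leave
# customers unserved in the meantime, which most firms can't.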

Financialisation.

General Electric was the poster boy for financialisation.

http://www.nytimes.com/2015/04/11/business/dealbook/general-electric-to-sell-bulk-of-its-finance-unit.html

"The expansion of the finance business was part of the broader diversification of G.E. starting in the 1980s under Mr. Welch, when American industry feared it might not be able to compete against Japan."

"For years, the financial tilt looked smart and relatively easy. In an interview in 2010, Mr. Immelt recalled, if a deal looked like a moneymaker, it got the nod. “And you don’t have to build a factory,” he said.

Yet the big bet on finance badly wounded G.E. after Lehman’s demise, when the market upheaval left the conglomerate hard-pressed to borrow debt for its day-to-day operations."

*** “And you don’t have to build a factory,” ***

Innovation.

Innovation drives investment - look at the Google car (self-driving) or Tesla (electric cars/batteries); the first-mover advantage is huge, with Apple, Uber, and countless universities and DARPA undertaking research. Only Tesla makes cars, and then only on a small niche scale, but the large German car companies are scrambling to catch up, or risk being screwdriver plants for the technology companies.

IP and patents obstruct the process of innovation; it is only the cash cows - the RIAA, the MPAA, drug companies etc. - who like IP, not the innovators.

nick ford

Chris highlights an interesting issue regarding how firms (and consumers) decide to hold off making particular investments because of improving technology. However, I don't think he demonstrates conclusively that there are any malign consequences of this behaviour. It is surely rational and economically efficient to take future technical improvements into account before making an investment decision. If this didn't happen, a lot of investment would be wasted.
I think it is far from clear that too little investment is the principal impediment to productivity growth in the UK, as Chris seems to assume. The advocates of 'additional investment' have caused us to invest billions more in higher education; it is far from clear this has improved productivity, because the money may be badly spent and not give people sufficient economically useful skills. Similarly, the massive investment in infrastructure such as high-speed rail is likely to be largely wasted.
Rather than jump to the conclusion that we need to look for ways to promote the volume of investment, we need to better understand why productivity growth has slowed, if indeed it really has.

