Science, technology & innovation – Economics Observatory

How can new financial technologies help to tackle social exclusion?

Fintech refers to ‘technology-enabled innovation in financial services that could result in new business models, applications, processes, or products with an associated material effect on the provision of financial services’ (Financial Stability Board, 2017).

The UK is home to a dynamic ecosystem of fintech organisations. These include new challenger banks (such as Monzo and Starling) and scale-ups that make money transfers faster and cheaper (for example Wise) and legal compliance easier (for example Cube). The application of technology in finance is not new, but the current wave of innovation is notable for its scope and scale.

Artificial intelligence and machine learning, cloud computing, open application programming interfaces and blockchain are among the technologies with the greatest impact. They are changing the ways in which financial service providers operate, communicate and engage with consumers and other stakeholders. New fintech applications are mobile-first, customer-centric and disruptive to previously unchallenged parts of the finance sector.

Is fintech benefiting everyone?

A central claim of many financial technology firms is that they provide new ways in which to tackle financial exclusion, that is, the ‘inability, difficulty or reluctance to access mainstream financial services, which, without intervention, can stimulate social exclusion, poverty and inequality’ (House of Lords Liaison Committee, 2021).

Being excluded makes life difficult in today’s highly financialised society. Access to a bank account and other basic banking products is a de facto requirement for most forms of accommodation, quality jobs or receiving welfare. But while consumers increasingly embrace digital alternatives for basic banking services, uptake of solutions that could have a greater impact – particularly among excluded and otherwise vulnerable consumers – has been slow.

Research exploring the use of fintech by financially vulnerable consumers shows that, for fintech to be more socially productive, entrepreneurs and policy-makers must improve both access and trust.

What opportunities does fintech offer to financially vulnerable and excluded individuals?

Traditionally, financial services firms have relied on brick-and-mortar branches and rigid legacy technology systems that are inefficient and costly to operate. These inefficiencies were often amplified by governance processes that require the completion of a series of time-intensive, manual tasks.

Recent fintech innovations have changed this, for example, by creating fully digital banking experiences and by implementing artificial intelligence (AI) to automate searching, matching, comparing, filling forms, reviewing and other rules-based back-office activities (Ashta and Herrmann, 2021). This type of automation leads to cost reductions that have the potential to make financial products and services more affordable to low-income consumers (Philippon, 2019).

In addition to increasing efficiency, open finance and AI can significantly improve the quality of debt advice services by providing a holistic picture of a customer's financial situation. Machine learning algorithms can analyse large quantities of financial and non-financial data and potentially uncover patterns or early signs of vulnerability that humans might not be able to identify (Azzopardi et al, 2019).

These insights can help advisers to improve the accuracy and timing of their recommendations to customers. Similarly, financial institutions can use insights from open data and machine learning to provide personalised products and services that can improve the financial wellbeing and resilience of customers.
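To make this concrete, here is a minimal, hypothetical sketch of the kind of pattern-spotting involved: an off-the-shelf anomaly detector (scikit-learn's IsolationForest) applied to synthetic account features. The feature names, data and thresholds are all invented for illustration and are not drawn from any real service.

```python
# Illustrative sketch: flagging possible early signs of financial
# vulnerability with an unsupervised anomaly detector.
# All features and figures below are synthetic and hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic monthly account features for 1,000 customers:
# [income volatility, days in overdraft, share of income spent on essentials]
typical = rng.normal([0.10, 2.0, 0.55], [0.05, 2.0, 0.10], size=(950, 3))
at_risk = rng.normal([0.45, 14.0, 0.90], [0.10, 4.0, 0.05], size=(50, 3))
X = np.vstack([typical, at_risk])

# Fit the detector and flag the most unusual accounts for human review.
detector = IsolationForest(contamination=0.05, random_state=0).fit(X)
flags = detector.predict(X)  # -1 = flagged as anomalous, 1 = typical
print(f"{(flags == -1).sum()} accounts flagged for an adviser to review")
```

In practice the flags would feed a human adviser's judgement rather than trigger automated decisions, which is consistent with the advisory role described above.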

Other AI applications help consumers to identify opportunities to reduce expenditure and maximise their income, for example, by providing income-smoothing options (services that turn unpredictable income streams into regular payments by identifying a customer's average earnings and balancing spikes or dips).

They can also assist by identifying benefits eligibility or offering automated money guidance. One of the most successful Scottish financial inclusion fintechs, InBest, has developed a platform that integrates these services to help vulnerable consumers to improve their situation and build up financial resilience.
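As a rough illustration of the income-smoothing idea described above, the following sketch pays out a trailing average of recent earnings and lets a buffer absorb the difference. All figures are invented, and real services would handle advances, fees and edge cases that are ignored here.

```python
# Minimal income-smoothing sketch: pay out a trailing average of recent
# earnings; spikes top up a buffer and dips draw it down.
# All monthly income figures are invented for illustration.
volatile_income = [1800, 600, 2400, 900, 2100, 700, 2500, 1000]

window, buffer = 3, 0.0
for month, earned in enumerate(volatile_income, start=1):
    recent = volatile_income[max(0, month - window):month]
    payout = sum(recent) / len(recent)   # trailing average of earnings
    buffer += earned - payout            # spikes top it up, dips draw it down
    print(f"Month {month}: earned {earned:>5.0f}, "
          f"paid out {payout:>7.2f}, buffer {buffer:>8.2f}")
```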

Combining open data with AI and machine learning also enables fintech firms to use new approaches to credit scoring and risk assessment (Bazarbash, 2019). These approaches are potentially more transparent and do not rely solely on credit history. They can therefore provide easier access to credit for people with no or limited credit history (Jagtiani and Lemieux, 2017).
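A hedged sketch of what such an alternative credit-scoring model might look like: a simple logistic regression trained on synthetic 'alternative data' features (on-time rent and utility payments rather than a credit history). The features, data and outcome model are assumptions for illustration only, not any lender's actual method.

```python
# Illustrative alternative credit scoring: score applicants with no
# credit history using hypothetical alternative-data features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

# Hypothetical 'alternative data' features instead of a credit history.
X = np.column_stack([
    rng.integers(0, 60, n),    # months of on-time rent payments
    rng.uniform(0.5, 1.0, n),  # share of utility bills paid on time
    rng.normal(400, 300, n),   # average monthly account balance
])
# Synthetic repayment outcome, loosely driven by the same signals.
p = 1 / (1 + np.exp(-(0.05 * X[:, 0] + 4.0 * X[:, 1] + 0.002 * X[:, 2] - 4.0)))
y = rng.random(n) < p

model = LogisticRegression(max_iter=1000).fit(X, y)
applicant = [[24, 0.95, 350.0]]  # two years of rent paid on time, no credit file
print("Estimated repayment probability:",
      round(model.predict_proba(applicant)[0, 1], 2))
```

A linear model like this is also easy to inspect, which connects to the transparency claims made for these approaches.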

Besides directly addressing excluded or vulnerable consumers, fintech can have indirect effects on financial poverty by increasing productivity and fostering sustainable economic growth (Appiah-Otoo and Song, 2021; Song and Appiah-Otoo, 2022).

For example, financial technology tools for payments, accounting, cash flow management, smart contracts and other business functions can help small and medium-sized enterprises (SMEs) to increase productivity and build up competitive advantages (for example, based on reduced cost of capital, improved operational efficiency or increased liquidity). This creates opportunities for quality employment within and outside the fintech space.

Fintech can also positively contribute to financial inclusion, resilience and wellbeing through government services. Digitising government services can make the distribution of stimulus packages or financial aid much more efficient. In April 2020, roughly 7.4 million consumers in the United States opened PayPal accounts to enable faster receipt and cashing in of their economic impact payments (EIP) that were part of the Covid-19 relief efforts. This is one example of the potential of government-fintech collaboration.

Why have we not yet seen the expected results?

Financial technologies have the potential to help marginalised communities, yet progress has been slow. Our research indicates that there are two main barriers that can limit the ethical and equitable application of fintech for financial inclusion. Policy-makers and entrepreneurs should take these into account as they encourage further activity in this area.

First, there are issues around access. Some of the most vulnerable financially excluded groups in developed countries lack access to even the most basic information communication technologies. Even where individuals own mobile phones or have access to a personal computer with broadband internet, there remain underappreciated hurdles relating to ‘data poverty’ that restrict access to online services.

Data poverty occurs where disadvantaged groups cannot afford to purchase enough data to access online services, thus excluding them from the full range of financial services. This data poverty can be especially pronounced in rural areas, where residents can have less reliable 4G and 5G phone signals.

This research also shows that vulnerable consumers often feel excluded from existing fintech services as they do not have sufficient financial literacy to make sense of new products and services. Attempts to address this issue by ‘educating’ vulnerable consumers are often seen as patronising and can disengage users by putting them into boxes they don’t see themselves in. The complexity of technical jargon and the overuse of buzzwords also act as significant barriers to engaging many vulnerable groups.

A lack of trust is the second major barrier limiting the extent to which fintech is addressing financial inclusion. Research highlights that some disadvantaged groups are wary of new fintech services that are designed specifically to help them.

For example, this study found resistance to a new service that used an innovative algorithm to maximise government welfare benefits for claimants. This was viewed with suspicion by many potential users despite appearing to be a beneficial service. In particular, vulnerable groups were concerned that providing more information to government agencies and their intermediaries could result in them losing money or otherwise being reprimanded for the information they disclosed.

Given that many fintech solutions rely on large quantities of user data to function, the withholding of important information could undermine the viability of services for disadvantaged groups, leading to even greater marginalisation.

The study also showed that individuals can conflate the use of legitimate digital financial services with an increased risk of online fraud and exploitation. Many older communities, and other vulnerable user groups, generalised that ‘most online services are a scam’ and therefore all digital services are better avoided.

There is general inertia around moving away from physical currency, as cash is perceived as a lower risk. Conflicting expert advice (share your data to get better products and services versus don’t share any data to avoid being exploited), as well as complex public debates around questionable data practices – for example, the Facebook-Cambridge Analytica scandal – make it even more challenging for non-expert consumers to judge the legitimacy of fintech solutions without any form of trusted guidance.

How can fintech overcome remaining barriers?

Financial technology holds promise for addressing social exclusion, but there are still barriers from a user perspective. Policy-makers have an important role to play in bridging these supply and demand-side issues that are currently holding back progress.

A first step in this direction could be the development of a set of principles guiding how fintech products and services are developed for marginalised, vulnerable and excluded groups. If widely adopted these could give those groups confidence that financial products and services were ‘safe’ to use. They would also ensure accessibility for a wider community.

We identify the following six principles for those developing fintech solutions for financially excluded groups.

  1. Explainability: technologically augmented decision-making affecting vulnerable groups should be fully explicable and auditable. There should be quick, easy and independent means available to challenge potentially unfair decisions.
  2. Bias mitigation: fintech developers should evaluate potential direct, indirect and intersecting biases when building products and services for marginalised user groups. Mitigation measures should be transparent and comprehensible to consumers and supporting third-sector organisations.
  3. Dignity: where possible, innovations should be created with – not for – users. User-centred and co-creation design tools should be adopted to improve the legitimacy and adoption of new innovations.
  4. Business model transparency: fintech ventures working with marginalised groups should be transparent about how revenue is generated, particularly where user data are monetised or customer service fees and interest are charged.
  5. Lightweight and non-obsolescent technologies: fintech entrepreneurs should build technological solutions that require minimal data usage and work on older hardware and operating systems.
  6. Accessibility and navigability: products and services should include only necessary functionality, be accessible to physically and cognitively impaired individuals, and adopt regionally appropriate variations of the internet crystal mark, which denotes clear use of language.

A common standard based on these principles, or some variation of them, would be a productive way of addressing the concerns many vulnerable groups have around adopting new financial technology innovations. Indeed, the adoption of these guiding principles could help all fintech firms – not just those that specifically address vulnerable consumers – to become more socially productive.

Where can I find out more?

Who are experts on this question?

  • Christine Oughton
  • Sian Williams
  • Karen Elliot
  • Thomas Philippon
Authors: Felix Honecker, Dominic Chalmers and Nicola Anderson
Authors’ note: We would like to thank the members and guests of the FinTech Scotland Consumer Panel and the organisations they are affiliated with.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860364. This article reflects only the author’s view and the agency is not responsible for any use that may be made of the information it contains.
Photo by lucigerma from iStock

How does uncertainty about the future affect climate change policy?

Uncertainty is all around us. Tsunamis; the systems-wide repercussions of a global pandemic; ecosystem collapse following the loss of a keystone species; or even the surprise win of a presidential candidate. These are all examples of unexpected events – and we are usually poorly prepared for how such events will ripple through society, the environment, the economy and our everyday lives.

This is also true for climate change. It is certain that climate change is happening and is driven by human factors (Intergovernmental Panel on Climate Change, IPCC, 2014). But its inherently complex nature makes it less clear what the impacts will be – including when and where all of them will happen, and to what degree.

Uncertainty about future climate policies, greenhouse gas emissions, complex climate and socio-economic feedback loops, and unknown tipping points further complicates our projections. For example, it is unclear to what extent warming oceans will affect global fish supplies, and how these changes may affect the broader food system and national economies. Similarly, the impact that heat waves may have on human health and labour productivity is uncertain.

We cannot predict the future with precision, but that does not mean we cannot or should not prepare for it. Acknowledging that uncertainty is inherently present in the world and affects our decision-making is a crucial first step. Understanding that different types of uncertainty exist, and how to approach them, comes next.

These include scientific uncertainty, which is the lack of exact knowledge, regardless of why this knowledge deficiency exists. Deficit uncertainty, for example, arises from a lack of accurate models, ignorance, biases and measurement errors. It can be minimised through technology and learning.

Complex uncertainty involves different interdependent factors and patterns of which we are both aware and unaware, and we are likely not to understand fully how they affect one another (Tye and Altamirano, 2017). It is therefore much harder to account for.

When both deficit and complex uncertainty are at work – as they are when dealing with the climate crisis – another layer of unknowns is added, resulting in deep or cascading uncertainty.

Figure 1: Areas of climate uncertainty

Source: World Resources Institute

Climate adaptation and mitigation

Approaches to tackling the climate crisis are often split into two camps, with blurred lines between the two. The first – mitigation – focuses on reducing greenhouse gas emissions that are the root cause of the problem. The second is the process of helping societies prepare for and adapt to current and future climate change impacts – adaptation.

A third camp – climate resilience – is intended to be a bridge, acknowledging the many synergies and co-benefits of addressing the two together, but a full merger has not yet been achieved in practice.

Uncertainties abound with both mitigation and adaptation. But these are more salient in decision-making for climate adaptation because it is much less quantitative in nature. For mitigation policies, scientists can calculate the quantity of emissions in the atmosphere, know what is being emitted by different countries and industries, and come up with informed projections of future emissions paths. They can also develop recommendations on how to reduce emissions more effectively for each sector, and estimate the costs of reaching targets.

On the other hand, it is much harder to understand and attribute the minuscule and systemic ways in which the climate crisis affects people, ecosystems and sectors like agriculture. For example, how much of an incoming drought is worsened by the climate crisis, and how much can be attributed to other factors? Or what will the pattern of infectious, vector-borne diseases look like in 20 or 50 years, and what will be the full health and social consequences of these phenomena? These are questions with fuzzy and incomplete variables and answers.

Yet despite these challenges, approaches have been developed to navigate uncertainty in climate mitigation and adaptation.

How can policy-makers navigate uncertainty to mitigate carbon emissions?

Often simulations (or models) help researchers and policy-makers to analyse quantitatively the climate policy and actions needed to understand uncertainty. A key aspect of these models is that they explore potential scenarios.

They can identify a range of options for reaching long-term climate mitigation targets and indicate what kinds of investments, actions and policies can enable the transition. Models do not predict precise future outcomes. Instead, they analyse possible future states of the world (or scenarios) from which policy implications can be drawn.

Building a scenario analysis to explore how the climate crisis affects people and places over time is a challenging task. One of the biggest difficulties lies in selecting and building plausible future scenarios.

There is still great uncertainty over the future course of climate policies, climate impacts and other factors that materially influence businesses and investments. Uncertainty permeates scenarios and models. Often, researchers and planners analyse past and present emissions and try to forecast future ones. From this, they identify the optimal plan or strategy for the perceived circumstances.

This can be risky where uncertainty is significant, as with climate change. If the analyses and forecasts turn out to be seriously wrong, the outcome could be devastating and entail irreversible losses. For example, many insurance companies became insolvent after Hurricane Andrew – which struck the Bahamas, Florida and Louisiana in 1992 – largely because the risks had been underestimated.

Uncertainty can be accounted for in modelling studies in many ways. The most frequent method is to look into uncertainty about input indicators (for example, GDP growth, impacts from climate, and discount rate – the interest rate used to determine the value of future cash flows) that can be addressed by including a sensitivity analysis of the key inputs for the scenarios modelled.

This analysis is done by varying the assumptions for individual key scenario inputs and seeing how the results change. For example, by varying the discount rate (reflecting how society values future costs and benefits), the assessment can produce different results and hence different policy implications.
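A minimal sketch of this kind of sensitivity analysis, assuming a stylised stream of future climate damages and varying only the discount rate (all figures invented):

```python
# Discount-rate sensitivity: hold a stylised damage stream fixed and
# vary only the discount rate to see how the present value shifts.
damages = [100.0] * 50  # e.g. 100 units of damages in each of the next 50 years

def present_value(flows, rate):
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows, start=1))

for rate in (0.01, 0.03, 0.05):
    print(f"Discount rate {rate:.0%}: present value = "
          f"{present_value(damages, rate):,.0f}")
```

Even in this toy case, the present value of the same damages more than doubles when the discount rate falls from 5% to 1%, which is why the choice of rate can flip policy conclusions.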

More advanced techniques, such as ‘decision-making under deep uncertainty’, take a different track. Instead of seeking the optimal option under a certain future scenario or just reporting how results change when varying one parameter at a time, this approach seeks to find robust options across diverse possible future scenarios. This involves testing hundreds, thousands or even more scenarios that are constructed by identifying material uncertain factors and assigning varied combinations of values to these.

This method provides flexibility in the decisions that policy-makers make: instead of choosing one scenario that provides an optimal solution for one future state of the world, they would choose the pathway that will perform well over many different scenarios (which are uncertain).
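The sketch below illustrates the flavour of this approach under invented assumptions: it builds 1,000 scenarios from combinations of uncertain factors and prefers the policy with the smallest worst-case 'regret' (the gap between a policy's cost and the best achievable cost in that scenario), rather than the optimum for any single forecast. The cost function, policies and parameter ranges are all hypothetical.

```python
# Toy 'decision-making under deep uncertainty': evaluate each policy
# across many scenarios and pick the one with the smallest worst-case
# regret. Policies, costs and scenario ranges are invented.
import itertools, random

random.seed(0)

policies = {"weak": 0.2, "moderate": 0.5, "ambitious": 0.8}

def net_cost(strength, carbon_price, tech_cost, climate_sensitivity):
    # Stylised trade-off: stronger policy costs more up front but cuts
    # the climate damages, which scale with climate sensitivity.
    abatement = strength * tech_cost * 100
    residual_damage = climate_sensitivity * (1 - strength) * carbon_price
    return abatement + residual_damage

# 10 values per uncertain factor -> 1,000 scenario combinations.
scenarios = list(itertools.product(
    [random.uniform(50, 250) for _ in range(10)],   # carbon price
    [random.uniform(0.5, 2.0) for _ in range(10)],  # technology cost index
    [random.uniform(0.8, 1.6) for _ in range(10)],  # climate sensitivity
))

worst_regret = {}
for name, strength in policies.items():
    regrets = []
    for s in scenarios:
        cost = net_cost(strength, *s)
        best = min(net_cost(p, *s) for p in policies.values())
        regrets.append(cost - best)
    worst_regret[name] = max(regrets)

print("Smallest worst-case regret:", min(worst_regret, key=worst_regret.get))
```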

How can policy-makers navigate uncertainty to adapt to climate change?

Two of the main frameworks used by planners, policy-makers and adaptation practitioners to plan for and identify adaptation measures are resilient and adaptive approaches.

The resilient approach works well over a variety of possible outcomes, and involves techniques such as robust decision-making (RDM). This is an analytic methodology for planners that begins with a decision to act, and then looks at climate models, socio-economic data and other relevant information to identify the best strategy over a variety of future scenarios.

For example, RDM probability calculations could determine that an extensive drought in a specific region may last four to eight months, with the latter being more likely. Planners might then undertake drought preparations for six months or more considering these forecasts and be ready for multiple scenarios as a result.

An adaptive approach is more flexible. It usually responds to triggers and can be modified in real time as events unfold. One technique under this approach includes iterative risk management (IRM), a participatory technique that allows for flexible and reversible decision-making even when risks and thresholds are unclear.

For example, planners took an IRM approach to the Thames Estuary 2100 Project – a response to the region’s 1953 flood catastrophe, which caused a heavy loss of life and property. More than a million people live within the Thames floodplain and the area comprises about £200 billion in property in and around London. So planners developed a system that allows river barriers and defences to be raised or lowered at any time, providing flexible protection against sea level rises and tidal flooding.

How do different uncertainties affect policy design?

Future reductions in greenhouse gas emissions depend on what technologies and innovations become available, how quickly they are deployed and scaled, and whether they are widely accepted by society. For example, the UK’s long-term strategy sees the future roles of electrification and hydrogen fuel in the building and transport sectors as sources of uncertainty. Similarly, the United States identifies the growth of clean vehicles (for example, electric vehicles and fuel cell vehicles) as one important uncertainty.

With climate adaptation, integrating uncertainty into planning and policy design is crucial because it improves the robustness of adaptation strategies and will affect how resources are allocated and distributed over the short, medium and long term.

Governments can systematically expand the multiple scenario approach explained above by ‘stress-testing’ their climate strategies against the different scenarios. Stress-testing is a risk management tool that involves analysing the impacts of extreme scenarios that are unlikely but feasible. This approach is similar to the stress tests that banks widely introduced after the global financial crisis of 2007-09 to examine their vulnerabilities to external shocks such as a stock market crash or a severe economic downturn.

The idea is the same. Repeating the stress tests for different strategies will help countries to identify the most important uncertainties to address and robust options to include in their long-term strategies that would perform well across many different scenarios.

For example, in one analysis, we stress-tested three policy packages for a hypothetical country’s long-term strategy against 1,000 scenarios with different assumptions of future cost reductions in low-carbon technologies.

In this demonstration, the stress test may suggest that the cost reduction of electric vehicles was the most critical uncertainty across scenarios. This is because, although costs of electric vehicles have been decreasing in recent years, it is uncertain whether they will compete (in terms of price) with petrol or diesel cars.

The analysis also indicated that policies to boost electric vehicle sales would reduce uncertainty in the hypothetical country's emissions outcome in 2050. For a real country, the most important uncertainties and the most robust policy options may differ, but the stress test process would be similarly useful for identifying robust policy options.
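A toy version of such a stress test, with all relationships and numbers invented: 1,000 draws for future electric-vehicle cost reductions, comparing the spread of a stylised 2050 emissions index with and without a policy that boosts EV sales.

```python
# Toy stress test: sample 1,000 scenarios for EV cost reductions and
# compare the spread of 2050 emissions with and without an EV policy.
# All relationships and numbers are invented for illustration.
import random, statistics

random.seed(7)
scenarios = [random.uniform(0.0, 0.6) for _ in range(1000)]  # EV cost fall by 2050

def emissions_2050(cost_fall, ev_policy):
    # EV uptake rises as costs fall; the policy raises uptake further.
    uptake = min(1.0, cost_fall * 1.2 + (0.35 if ev_policy else 0.0))
    return 100 * (1 - 0.5 * uptake)  # index: 100 = today's transport emissions

for policy in (False, True):
    results = [emissions_2050(c, policy) for c in scenarios]
    print(f"EV policy={policy}: mean={statistics.mean(results):.1f}, "
          f"spread={statistics.stdev(results):.1f}")
```

In this invented setup, the policy both lowers the mean emissions index and narrows its spread across scenarios, mirroring the point made above about reducing uncertainty in the emissions outcome.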

It is clear that uncertainty affects decision-making in climate mitigation and adaptation in numerous ways, and that scientists and policy-makers rarely have perfect information or knowledge. But this does not mean that decisions and actions should be put on hold.

Reducing emissions and protecting people and ecosystems from climate impacts remains an imperative, and increasingly sophisticated tools continue to be developed to take account of uncertainty and face it head-on.

Where can I find out more?

Who are experts on this question?

  • Juan Carlos Altamirano, economist at World Resources Institute
  • Ichiro Sato, Executive Senior Research Fellow, JICA
  • Robert Lempert, Director, Frederick S. Pardee Center for Longer Range Global Policy and the Future Human Condition; RAND Corporation
  • Stéphane Hallegatte, Senior Climate Change Adviser, World Bank
Authors: Stefanie Tye and Juan Carlos Altamirano
Photo by oobqoo from iStock

The price of everything

Newsletter from 13 May 2022

Policy-makers around the world are scrambling to deal with the effects of inflation and the growing cost of living crisis. Last week, the Bank of England raised interest rates for the fourth time in a row and its counterpart central bank in the United States, the Federal Reserve, announced the biggest interest rate hike in over 20 years.

New data show that the UK economy contracted in March and growth looks set to be stifled for the rest of the year as inflation squeezes living standards across the country. How policy-makers navigate this threat of ‘stagflation’ depends partly on what is valued, how different goods and services are priced and what sacrifices are made.

High prices, low rates

With inflation at its highest level in three decades, and predicted to climb as high as 10% later this year, Huw Dixon (Cardiff Business School and a lead editor here at the Economics Observatory) this week laid out how inflation affects the economy when interest rates are near zero.

The Bank of England’s Monetary Policy Committee has set very low nominal interest rates since the global financial crisis of 2007-09, so the recent spike in inflation has just extended the longest period of sustained negative real interest rates in UK history (see Figure 1).

Figure 1: Nominal and real interest rates (2009-2022)

Source: Bank of England

Large-scale asset purchases or quantitative easing by the central bank have also helped to keep real interest rates low in a bid to stimulate the economy. But the most important effect of inflation, Huw argues, with these conventional and unconventional monetary policy measures in place, is the redistribution of wealth from savers to borrowers.

The most notable beneficiary of this transfer is the government. Since the government is the biggest borrower in the UK (with debt equal to around 100% of GDP), inflation can be seen as a tax on government bond-holders. As a result, if inflation reaches 10%, the inflation tax will be equivalent to about 10% of GDP or more. Households bear the brunt of this, although Huw points out that those with mortgages also benefit from the real value of their debt being eroded.
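A stylised calculation may make the arithmetic behind this clearer, using the article's own figures; the 1% nominal interest rate is an assumption added purely for illustration:

```python
# Stylised 'inflation tax' arithmetic: inflation erodes the real value
# of government debt by roughly the inflation rate times the debt ratio.
debt_to_gdp = 1.00   # government debt at around 100% of GDP (from the text)
inflation = 0.10     # 10% annual inflation (from the text)
nominal_rate = 0.01  # near-zero nominal rate (illustrative assumption)

real_rate = nominal_rate - inflation     # approximate Fisher relation
inflation_tax = inflation * debt_to_gdp  # transfer from bond-holders

print(f"Real interest rate: {real_rate:.1%}")        # -9.0%
print(f"Inflation tax: {inflation_tax:.1%} of GDP")  # 10.0% of GDP
```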

The 1970s offer a lesson here: when inflation is prolonged, it becomes entrenched and difficult to reduce. In such circumstances, as now, UK central bankers face a difficult decision. Raising interest rates further will weaken government finances. It will also be unpopular with those who have benefited from high house and share prices, and could contribute to a recession.

On the other hand, not acting would erode confidence in the Bank of England’s ability to control inflation – something it has worked so hard to achieve in the quarter of a century since its independence (as reported in an Observatory piece last week by Jagjit Chadha of the National Institute of Economic and Social Research, NIESR, another of our lead editors).

Basket cases

When the Office for National Statistics (ONS) calculates the rate of inflation each month, it uses a ‘basket’ of 700 goods and services and measures changes in their prices. As pointed out by Diane Coyle (University of Cambridge and another of our lead editors) in a new piece this week, social media and internet search are examples of products that are not included in this basket. As free services that we don’t directly spend money on, they cannot be included in price data. Instead, we pay for them in terms of our time and our data.

In general, the rise of the internet has been a good news story for inflation, as many of the products that we used to pay for are now free. The price of telecoms services – such as bundles of calls, texts and data – has also fallen significantly, for example.

But zero prices for much-valued free digital apps do not tell the whole story. Advertising opportunities and data collection are needed to keep these services free. Universality and affordable access are also important challenges for policy-makers. For example, there is evidence that children in lower-income households in England had worse access to online lessons during school closures as a result of the pandemic – a problem anticipated by some of the early pieces on the Observatory in mid-2020.

The cost of lockdown

Lockdown learning is explored further by Per Engzell, Bastian Betthäuser and Anders Bach-Mortensen (University of Oxford) in their piece for the Observatory this week. Educational data are less widespread and timely than data for many other sectors of the economy. But the authors highlight an early and influential study from the Netherlands, which finds that students learned very little during the first eight weeks of home schooling. Students whose parents were less well educated also suffered the largest setbacks.

A systematic review of the current evidence also finds that students lost out on around 43% of a school year’s worth of learning, on average (Betthäuser et al, 2022). This study suggests that the worst learning deficit occurred early in the pandemic.

And yet, the research indicates that there are some reasons to be optimistic even after two years of interrupted schooling. In particular, the learning deficit can be undone and long-term consequences prevented, as long as decision-makers show enough resolve.

Just as with the challenge of the escalating cost of living crisis, making up for learning losses caused by Covid-19 remains paramount. Children and young people need to be equipped with the skills necessary to enter the ever-turbulent world of work. Policy-makers must continue to recognise and mitigate the hidden costs of Covid-19. Now as much as ever, knowing both the price and the value of everything is vital.

Observatory news

  • ESCoE Conference on Economic Measurement (25-27 May), University of Strathclyde. On Thursday 26 May, the Economics Observatory will be running a two-hour workshop at the event to highlight the advantages and some caveats of using data visualisation as the main channel for communicating economic information to a wide and diverse audience. This workshop will include a 'code along' where participants can learn to create an interactive chart like those on our data hub. Registration for the conference at the University of Strathclyde is open here.
  • ECO Collection: Scottish Independence. We recently launched our first printed collection, Scottish Independence: Economic questions and research evidence. A digital version is available to read online.
  • UCL Stone Centre. Our colleagues at University College London are running an event to celebrate the launch of the UCL Stone Centre (on Wealth Concentration, Inequality, and the Economy). This will be held at Church House, Westminster on 26 May 2022 at 5-7pm, followed by a reception. The event will feature a keynote lecture in-person by Nobel laureate James Heckman, on policies around social mobility and human flourishing. This will be followed by a panel discussion on inequality and firms chaired by another Nobel laureate Sir Angus Deaton (also in-person). The registration link is here for both in-person and online.
Author: Ben Pimley
Picture by Viktoria Korobova on iStock

How do free digital products affect prices and living standards?

Many of us use a smartphone every day for things that would have previously required all kinds of additional products that we would have had to pay for – including watches, diaries, maps and cameras. We also use them for newer services such as social media and online search. But how do these free digital products affect inflation and living standards?

Since the launch of the Apple iPhone in 2007, the spread of 3G and subsequent mobile networks, and the development of software for apps, smartphone use in the UK, as elsewhere, has become widespread. Ofcom (the UK’s regulator for all communications services) reports that in 2020, 85% of internet users over the age of 16 were going online using a smartphone (see Table 1). Further, these devices account for just over two-thirds of the five-plus hours that the average user spends online.

Table 1: Devices used to go online, among those who go online, by age group

| Device used to go online | 16+ internet users | 16-24 | 25-34 | 35-44 | 45-54 | 55-64 | 65+ |
|---|---|---|---|---|---|---|---|
| Smartphone | 85% | 87% | 91% | 93% | 92% | 85% | 59% |
| Computer | 74% | 68% | 66% | 71% | 78% | 77% | 87% |
| Tablet | 51% | 37% | 48% | 56% | 55% | 54% | 56% |
| Smart TV | 41% | 42% | 46% | 50% | 43% | 34% | 26% |
| Games console | 21% | 43% | 30% | 26% | 17% | 5% | 1% |
| Smart speaker | 20% | 23% | 22% | 24% | 25% | 16% | 11% |
| Wearable tech | 12% | 12% | 17% | 16% | 13% | 8% | 4% |
| Only use devices other than a computer to go online | 26% | 32% | 34% | 29% | 22% | 23% | 13% |
| Only use a smartphone to go online | 10% | 12% | 15% | 13% | 8% | 8% | 2% |
Source: Ofcom Adults’ Media Literacy Tracker 2020

People have to pay for their data plans (which in the UK usually include the handset), and for some subscription services, such as paid-for streaming services like Netflix or some magazines. But much of what can be accessed either by smartphone or other devices is ‘free’. Some of the functions and apps are entirely free – such as the camera or calculator – while others we pay for with attention to adverts, or with personal data, rather than money.

Items that we stop purchasing are removed from the ‘basket’ of goods used to calculate the Consumer Price Index (CPI). When the amount spent on them out of total budgets is so low, they get no weight in the measure of inflation. For example, 2022 saw the removal of reference books such as road atlases and dictionaries, with the Office for National Statistics (ONS) commenting: ‘The rise of the internet has seen the need for reference books fall, all age groups are resorting to electronic maps to plan journeys etc while dictionary and thesaurus apps are also available’. These substitutions are good news for consumers, allowing people to spend their money on other items.
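The mechanics are simple to sketch. In a stylised expenditure-weighted index (illustrative shares and price changes, not real CPI weights), an item that attracts almost no spending contributes almost nothing to measured inflation:

```python
# Why near-zero spending means near-zero weight in an inflation index.
# Shares and price changes below are invented for illustration.
basket = {
    # item: (expenditure share, price change over the year)
    "food":            (0.300, 0.08),
    "energy":          (0.250, 0.25),
    "telecoms bundle": (0.200, -0.03),
    "other":           (0.249, 0.05),
    "road atlas":      (0.001, 0.02),  # barely bought, so barely weighted
}

inflation = sum(share * change for share, change in basket.values())
print(f"Weighted inflation: {inflation:.2%}")
```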

What about the money that we do have to pay to access apps and services? Over time, the price of telecoms services has declined, but its measurement is complicated. Because consumers pay for bundles of voice calls, texts and data services in packages, prices are difficult to compare directly. The bundles might also include fixed line charges and access charges, as well as (part) payment for the device itself.

In practice, the lowest price of a bundle that specified types of consumers would choose is used in constructing a price index. Recent work on the price index for telecoms services has found significant declines in prices over recent years, with the extent of the fall depending on how the complexities are treated (Abidrahman et al, 2020 and 2022 forthcoming).
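A simplified sketch of that bundle-selection step, with invented tariffs and a hypothetical usage profile: take the cheapest tariff that covers the specified consumer's usage, which is the price the index would then track over time.

```python
# Lowest-cost bundle selection for a price index: find the cheapest
# tariff covering a given usage profile. Tariffs are invented.
profile = {"minutes": 300, "texts": 500, "data_gb": 10}

tariffs = [
    {"name": "Basic",  "minutes": 250,  "texts": 1000, "data_gb": 5,   "price": 10.0},
    {"name": "Medium", "minutes": 500,  "texts": 1000, "data_gb": 12,  "price": 14.0},
    {"name": "Large",  "minutes": 9999, "texts": 9999, "data_gb": 100, "price": 22.0},
]

def covers(tariff, profile):
    return all(tariff[k] >= v for k, v in profile.items())

eligible = [t for t in tariffs if covers(t, profile)]
cheapest = min(eligible, key=lambda t: t["price"])
print(f"Index uses: {cheapest['name']} at £{cheapest['price']:.2f}")
```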

The ONS has adopted the improved approach, showing a fall of over a third in telecoms services prices since 2010 (see Figure 1). This price fall tallies with a surge in the volume of data used.

Figure 1: Current and improved telecoms services deflator, 1997 to 2016

Source: ONS, 2020

When it comes to the range of apps and services for which we pay in attention or data, there is no settled way to estimate the benefits that people get from them. The zero price for a social media app or bundled free apps does not feature in the CPI, precisely because we are not spending money on them.

One approach to ascertaining the consumer value of free goods, widely used in environmental economics, is to use either an experimental approach or a carefully structured survey to find out how much money people would require to give up access to one of the free goods for a period of time.

Their ‘willingness to accept’ (WTA) loss of the app or service is a measure of the total benefit (or ‘consumer surplus’) that they gain from using it. This differs from a market price, which is lower than the total consumer surplus (because there are always some people who would have paid more if they had to).
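As a small illustration of how such survey responses might be summarised (synthetic numbers, not from any cited study), the median is typically reported because a handful of very large answers would otherwise distort the mean:

```python
# Summarising hypothetical 'willingness to accept' survey responses.
import statistics

# Monthly WTA (in $) to give up a free service, from a mock survey.
wta_responses = [0, 5, 10, 20, 25, 37, 40, 50, 75, 100, 150, 2000]

print("Median WTA:", statistics.median(wta_responses))          # robust summary
print("Mean WTA:  ", round(statistics.mean(wta_responses), 2))  # pulled up by outliers
```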

A number of studies have tried this approach, with varying results. In the United States, research has focused on Facebook. One study found that the median US Facebook user needed around $37 to give up the service for a month – although just $322 to give up ‘all social media’ for one year (Brynjolfsson et al, 2019). Another found a median annual figure of $59 WTA loss of Facebook (Sunstein, 2018), while other research has found that it was over $1,000 (Corrigan et al, 2018).

One UK-based study found a wide range of median annual WTA values for different free goods, with search (£1,500) and personal email (£3,500) the most highly valued categories. In comparison, Facebook and many free apps had much lower values, at £150 and £10 a year, respectively (Coyle and Nguyen, 2020).

This last study has been used to explore changes in stated values since the Covid-19 pandemic. It shows significant gains in the value that survey respondents gave for categories like online shopping and online learning. While unsurprising, these increases suggest that there is useful information about the value that people assign to digital services via this approach.

Although there are large uncertainties about how to value them, it is a great benefit for many millions of people to be getting things that they want for free. Digital services are competing with each other for our time, but equally many of them save us time or fill our time enjoyably or productively.

As is increasingly widely appreciated, they also acquire personal data about our preferences and purchases to sell advertising or marketing analytics – the revenue sources that enable the services themselves to stay free. Free digital goods have not affected measured inflation because anything on which nothing is spent will not be included in the index; but they have been a source of gains in consumers’ living standards nevertheless.

One OECD study identified other sources of digital gains, in addition to the free apps, particularly improved quality and more choice (Reinsdorf and Schreyer, 2019). If they were incorporated into inflation measurement, they would have tended to reduce the inflation rate.

But the more that reliance is placed on digital services, the more important it becomes to have universal and affordable access. For example, there is evidence that children in lower-income households in England had worse access to lessons online during the pandemic.

And while the price of telecoms services has declined over time, the cost of data plans is a significant one for low-income households. Just under one in five UK households (4.2 million) have struggled to afford at least one of the communications services they need, and around a third among households claiming benefits, according to Ofcom. Zero prices for much-valued free digital apps do not tell the whole story.

Where can I find out more?

Who are experts on this question?

  • David Nguyen, OECD
  • Manuel Tong, University of Sussex
  • Cahal Moran, LSE
  • Rebecca Riley, King's College London
  • Kevin Fox, UNSW Sydney
  • Erik Brynjolfsson, Stanford University
  • Paul Schreyer, OECD
Author: Diane Coyle
Image credit: grinvalds on iStock

How can Northern Ireland improve its innovation ecosystem?

An innovation ecosystem comprises individuals, institutions and resources that connect to enable new ideas, products, processes and services to be created and brought to market. An ecosystem functions best when all elements are balanced and work together – and its performance is ultimately reflected in its innovation outputs.

Northern Ireland’s record on innovation activity is relatively poor compared with England, Scotland and Wales. The nation needs to do better at translating its innovation inputs into innovation outputs and ensuring that the capacity and capability to generate new ideas is strengthened. This will help to support the vision of a 10X Economy set out by the Northern Irish Department for the Economy (DfE) to foster innovation and promote greater prosperity.

What is an innovation ecosystem?

An innovation ecosystem can be defined as a ‘network of individuals, entities, resources, and structures that join forces in a way that catalyses new products, ideas, methods, systems, and even ways of life’ (WeWork, 2020).

As with ecosystems found in nature, the innovation ecosystem does not just refer to its constituent parts. It also includes how they interact with each other in their physical environment, in this case, either to enable or impede innovation.

Although there is a range of contributors within an innovation ecosystem, the main stakeholders can be identified as government, universities and research institutions, financiers and investors, incubators and accelerators, industry, intermediaries, and entrepreneurs and innovators.

The last group comprises the principal actors, exchanging knowledge, skills and ideas within the system to obtain the capital and other resources required to generate innovation and growth.

No one part of the ecosystem operates in isolation. Similarly, the ecosystem itself operates within the prevailing economic and wider framework conditions. These are recognised as the factors external to firms that influence innovation performance and market success. They include the public research base, the business and regulatory environment, physical and digital infrastructure, demand for innovation, and the available human capital (Nesta, 2011).

What does the evidence tell us?

We can measure the success of Northern Ireland’s innovation ecosystem in terms of how it performs in both a UK and wider European perspective. We consider innovation inputs and innovation outputs, as both provide an indication of the effectiveness of the ecosystem.

Innovation inputs are resources devoted to the generation of innovation, such as spending on research and development (R&D) and the supply of skilled workers (or human capital). Innovation outputs represent the new processes, inventions and ideas brought to market.

R&D delivers the knowledge, insight and experimentation used in developing innovations. In Northern Ireland in 2020, £913 million was spent on R&D (NISRA, 2021). Almost three-quarters of this was business expenditure on R&D (BERD), just under a quarter was higher education expenditure on R&D (HERD) and the remaining 3% was government expenditure on R&D (GERD).

In real terms, BERD has performed well over the last decade, increasing by 55% since 2010 (see Figure 1). Similarly, GERD has grown by 49%, although expenditure is at a much lower level. In contrast, HERD grew by just 11% over the decade between 2010 and 2020.

In 2020, in-house BERD accounted for 1.5% of Northern Ireland’s gross value added (GVA) – a measure of the value of goods and services produced. This was similar to the UK average (1.4%) but much lower than the best performing UK region, the East of England, with spending equivalent to 3.5% of GVA.

It is also worth noting that in Northern Ireland R&D activity is highly concentrated; the top ten R&D spending businesses account for one-third of all BERD, while almost two-fifths of BERD spending is in Belfast.

Skilled labour contributes to the absorptive capacity of a firm or region as it relates to the ability of individuals to understand and apply external information, knowledge and technology, thereby enhancing their innovative capabilities.

Northern Ireland’s working age population has a higher share of people with no or low skills relative to other UK regions (Annual Population Survey, 2022). Yet employees in Northern Ireland are actually relatively well qualified, with 37% qualified to national qualifications framework (NQF) level 6 and above (degree, masters or PhD level). This compares with 34% for the UK as a whole (UUEPC, 2022).

In fact, in the UK regional context, Northern Ireland comes second only to London in terms of the proportion of employees qualified to this level (see Figure 2). The share of the Northern Irish population with foundational level essential digital skills also measures up well at 79% compared with 81% in the UK (Lloyds Bank, 2021).

Figure 1: Real expenditure on R&D in Northern Ireland, 2010-20

Source: Northern Ireland Statistics and Research Agency (NISRA)

Figure 2: Proportion of the employed (aged 16-64) qualified to NQF level 6+, 2021Q4

Source: UUEPC based on Office for National Statistics (ONS), Labour Force Survey

Northern Ireland does less well when the focus is on innovation outputs. The most recent UK Innovation survey covering the period 2016-18 shows Northern Ireland to be joint bottom of the UK regional league table in terms of innovation-active businesses (see Figure 3).

Just 32% of Northern Irish (and Scottish) businesses are innovation-active compared with a UK average of 38%. Notably, rates across the UK are lower than those reported in the previous survey for the 2014-16 period.

When broken down further into more specific components, businesses in Northern Ireland are broadly on a par with those in the rest of the UK in terms of process innovation – at 12% of businesses compared with 13% in the UK.

But the share engaged in product innovation – in other words, new goods or services – is lower with just 13% in Northern Ireland, compared with 18% in the UK. The gap is still there when assessing the share of product innovators with new-to-market products: just over one-third of innovators in Northern Ireland undertake this more radical type of innovation compared with two-fifths in the UK.

Patenting activity, which gives inventors intellectual property rights in their new ideas, is also low in Northern Ireland. In 2020, there were almost 12,000 patent applications filed in the UK: of those just 153, or 1%, were from Northern Ireland (Intellectual Property Office, 2021).

Likewise, of the 4,500 patents granted, just 65 were from Northern Ireland, again representing 1% of the total. Both trademarking and designs registered there also each account for just 1% of the respective UK totals.

Given that Northern Irish businesses represent around 2% of all UK business activity, shares of 1% in these types of innovation output activities indicate that they are lower than would be expected.

In contrast, university spin-outs – companies that emerge from scientific research – represent a successful element of Northern Ireland’s innovation ecosystem. In 2020, Queen’s University Belfast was ranked first in the UK, and Ulster University 16th, in terms of their entrepreneurial impact ranking, a metric calculated according to the key indicators that influence spin-out activity at universities (Octopus Ventures, 2020).

Figure 3: Percentage of innovation-active businesses by UK region, 2014-16 and 2016-18

Source: NISRA
Note: Round point = 2014-16 level; bar = 2016-18 level

Figure 3 suggests a relatively poor performance in the UK’s innovation context. But Northern Ireland performs at or above the European Union (EU) average in a small number of innovation-related pursuits, including the above-mentioned product and process innovation activities.

Where Northern Ireland particularly excels is in the collaboration of small and medium-sized enterprises (SMEs) with other enterprises or institutions (see Figure 4). This is a form of open innovation whereby innovation is co-created with external partners.

In Northern Ireland, businesses collaborate for innovation at twice the EU average rate. This collaboration is strongest with suppliers, and private sector clients or customers, although in Northern Ireland, it is mostly undertaken at the local rather than national or international level.

Figure 4: Northern Ireland and EU relative performance in innovation activities, 2019 (EU=100)

Source: European Regional Innovation Scoreboard

How might the innovation ecosystem be improved?

The evidence summarised above suggests that despite the poor outcomes, some components of the Northern Irish innovation ecosystem are performing relatively well. But it is not enough for individual elements of the ecosystem to work.

Indeed, according to the European Commission, a balanced innovation system is needed that performs well across all dimensions. The Commission’s analysis suggests that the most innovative regions are those within innovative countries and those that perform particularly well in terms of their research system and business innovation (European Commission, 2021).

In Northern Ireland, it is encouraging that BERD has increased, but it is concentrated within too few firms and it is regionally unbalanced. And while some parts of the nation’s workforce are well qualified in a UK context, their skills capabilities are not being translated into innovation outputs.

This is particularly the case with new-to-market outputs, suggesting a potential lack of innovation prioritisation among business leaders. This deficiency of innovation activity is, in turn, related to the nation’s poor productivity and economic performance.

The most recent economic strategy for Northern Ireland, a 10X Economy (DfE, 2021), emphasises the nation’s ambition for a decade of innovation. Its aim is to focus on innovation in areas of competitive advantage to deliver fundamental change resulting in a ‘ten times better economy’, which benefits all people, businesses and places.

The aim is admirable and undoubtedly there is an improving base from which to work. But the bottlenecks in the existing innovation ecosystem need to be identified and addressed. Without doing so, the ecosystem will not deliver on its potential.

Where can I find out more?

Who are the experts on this question?

  • Karen Bonner, Ulster University Economic Policy Centre
  • Steven Roper, University of Warwick
  • Nola Hewitt-Dundas, Queen's University Belfast
  • Kristel Miller, Ulster University
Author: Karen Bonner, Ulster University
Photo by gorodenkoff from iStock

What is web3 and what might it mean for the UK economy?

Associated with libertarian politics, arcane terminology and cartoon monkey avatars, the idea of ‘web3’ can be hard for outsiders to fathom. But beyond the obscurity and hype lie both opportunities and risks for the UK economy. So, what is web3? It very much depends on whom you ask.

web3 promoters

For its advocates, web3 marks an important shift towards the next iteration of the internet. Its predecessor, Web 2.0 – the era of large, powerful social media platforms (such as Facebook) – is said to be characterised by asymmetries and injustices.

Dominated by a small number of Big Tech companies whose founders and investors have amassed unprecedented amounts of wealth and power, Web 2.0 has had consequences that are widely seen as damaging to society and democratic institutions. 

This financial success seems to have been built off the backs of Web 2.0’s users. Professional creators of music, imagery and video receive only a small fraction of the revenues that their content generates for platforms like Spotify and YouTube.

Developers of apps have no option but to pay 15-30% of their revenue to the App Store (Apple) and Play Store (Google/Android) in return for distribution. At the same time, ordinary users supply the posts, engagement and behavioural data that are integral to the advertising-based business models of Instagram, Twitter and TikTok. Despite their role as ‘prosumers’ (producing as well as consuming), they receive no financial compensation. 

By contrast, web3 is said to offer a more egalitarian, peer-to-peer vision of the web, giving all users 'skin in the game'. By using blockchain technology to decentralise the web’s technical, legal and payments infrastructure, web3 supposedly promises to sweep away today’s Big Tech companies, which are seen as abusing their market position as gatekeepers to extract economic rents.

In their place will be new protocols and platforms, constituted as decentralised autonomous organisations (DAOs). According to web3 advocates, DAOs will be governed by their communities, transparent in their operations and immune from capture by narrow financial interests thanks to smart contracts (self-enforcing contracts programmed in computer code).
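
To make the idea of a self-enforcing contract concrete, here is a minimal sketch in Python. It is purely illustrative: real smart contracts are deployed on a blockchain (for example, written in Solidity), and the escrow scenario, names and methods below are hypothetical.

```python
# A minimal sketch of a 'self-enforcing contract' in ordinary Python.
# Real smart contracts run on-chain; everything here is hypothetical.

class EscrowContract:
    """Releases funds automatically once a pre-agreed condition is met.

    After 'deployment', no party can change the rule: the rule is the code.
    """

    def __init__(self, buyer: str, seller: str, amount: float):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.delivered = False
        self.settled = False

    def confirm_delivery(self) -> None:
        # In a real DAO, confirmation would come from an on-chain oracle
        # or a vote of token-holders, not from a trusted administrator.
        self.delivered = True

    def settle(self) -> str:
        if self.settled:
            raise RuntimeError("already settled")
        self.settled = True
        # The outcome is fixed by recorded state, not by anyone's discretion.
        recipient = self.seller if self.delivered else self.buyer
        return f"{self.amount:.2f} paid to {recipient}"

contract = EscrowContract(buyer="alice", seller="bob", amount=100.0)
contract.confirm_delivery()
print(contract.settle())  # -> '100.00 paid to bob'
```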

Transactions will take place in cryptocurrencies, with non-fungible tokens (NFTs) allowing intellectual property rights to be asserted over digital files, with benefits for creators and markets.

In time, these technologies will supposedly form the basis for a thriving economy in the metaverse – the putative 3D online world in which people will be able to work, socialise and play games in virtual reality. For now, the majority of web3 companies are focused on building the underlying ‘rails’, such as payments (for example, Ripple), technical infrastructure (for example, Aligned) and fraud detection (for example, Chainalysis). 

web3 detractors

Critics of web3 bring a very different perspective. Cryptocurrency sceptics – so-called ‘NoCoiners’ – see web3 as a cynical rebranding exercise. In their view, blockchain is a defunct technology and cryptocurrencies are scams that combine elements of Ponzi, pyramid and multi-level marketing schemes.

In such schemes, a constant supply of new marks is required to provide earlier investors with liquidity – and the inevitable conclusion is collapse. These critics say that web3 should therefore be understood as a story invented to make cryptocurrency investment appear more attractive to digital creators and those who otherwise dislike Big Tech. 

Some within the crypto movement also have deep reservations about web3 – including former Twitter chief executive officer, Jack Dorsey. Here, the objection relates to the influence of venture capital investors. With more funds at their disposal than there are good investment opportunities, investors like Andreessen Horowitz – a venture capital firm based in Silicon Valley – have been highly active in developing the web3 market, through public relations and government outreach (as well as large investments in web3 companies like the NFT marketplace OpenSea).

Outsized returns for the same group of investors who have profited from the dominance of today’s Big Tech companies are clearly at odds with the libertarian project of radical decentralisation, to which Jack Dorsey and many other crypto enthusiasts subscribe. 

What are some possible implications for the UK economy?

The criticisms levelled by web3’s detractors seem to be good reasons to reserve judgement on the overall vision for web3. It is also useful to break it down into its component parts, specifically cryptocurrency adoption, tokenisation and virtual economic growth.

Cryptocurrency adoption 

‘Cryptocurrency’ is something of a misnomer. There are very few things that Bitcoin, Ether or DogeCoin can actually be spent on – illegal drugs and NFTs aside. While cryptocurrency exchanges report billions of dollars’ worth of trading, this is overwhelmingly financial speculation and barely touches the real economy. In fact, it is possible that such speculation is channelling capital away from more productive forms of investment.

Cryptocurrency prices are also extremely volatile. According to the Financial Conduct Authority (FCA), 2.3 million UK consumers have already invested in crypto assets, meaning that a market crash might lead to large losses for retail investors. This would inevitably bring adverse consequences for consumer confidence and spending.

The same goes for fraud, which appears to be endemic to the crypto space. Meanwhile, the anonymity afforded by cryptocurrency significantly increases cybercriminals’ economic incentives to mount ransomware attacks. Affecting three-quarters of UK businesses in 2021, these involve hackers encrypting an organisation’s data and demanding a Bitcoin ransom to decrypt it. 

But UK regulators seem to be more concerned about the risk of financial instability. Most cryptocurrency is held by institutional investors, including hedge funds with leveraged positions. A collapse in crypto asset prices could force investors to sell off other assets to cover losses, reducing liquidity in the financial system and affecting investor sentiment. This could then have potential knock-on consequences for the real economy. 

As such, cryptocurrency markets can be compared to markets for derivatives such as futures and options: they represent a growth opportunity for the financial sector, but a systemic risk to the wider economy. 

But were the Bank of England to launch a central bank digital currency (CBDC), other opportunities might open up. For example, in a future downturn, the government might want to use monetary policy to stimulate economic activity. If it were to issue stimulus payments to individuals and businesses in a CBDC, it could programme in rapid devaluation, creating a strong incentive to spend rather than save, and hence increasing the effectiveness of the policy.
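
To illustrate the mechanism, here is a minimal Python sketch of a stimulus payment with programmed devaluation. The decay schedule and parameters are hypothetical assumptions for illustration, not features of any proposed CBDC design.

```python
# A 'programmable money' sketch: a stimulus payment that loses value each
# week it goes unspent, creating an incentive to spend rather than save.
# The 5% weekly decay rate is a purely hypothetical parameter.

def remaining_value(initial: float, weekly_decay: float, weeks: int) -> float:
    """Value left after `weeks` if the payment is still unspent."""
    return initial * (1 - weekly_decay) ** weeks

payment = 500.0   # initial stimulus payment
decay = 0.05      # hypothetical devaluation of 5% per week

for week in (0, 4, 12, 26):
    print(f"week {week:2d}: {remaining_value(payment, decay, week):7.2f}")
# week  0:  500.00
# week  4:  407.25   <- a strong nudge to spend early
# week 12:  270.18
# week 26:  131.77
```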

Tokenisation

Rather than issuing shares, web3 organisations issue tokens. These can offer rights of access to the organisation’s products, voting rights on aspects of the organisation’s decision-making, rights over digital property or a combination of all three. 

As tokens are financial assets, they can be traded speculatively in secondary markets. Much commentary has focused on cases where tokens have been instrumentalised in ‘pump-and-dump’ schemes – a form of scam where token-holders hype an asset to drive its price up sharply (pumping), before selling off their holdings (dumping) and precipitating a crash. 

Concerns have also been raised about the tokenisation of loyalty programmes and merchandise by UK football clubs, since it exposes fans to volatile crypto asset markets without obvious benefits over more conventional structures.

But from a purely economic perspective, tokenisation may prove to be an important innovation, in that it provides a new way for organisations to raise capital. The existence of the secondary market means that seed investors in web3 start-ups benefit from much greater liquidity than would be available if they bought equity. 

This can reasonably be expected to increase the pool of capital accessible to early stage tech businesses, with favourable consequences for the development of the tech sector. Given the UK’s strength in financial technology (fintech), decentralised finance (or DeFi) seems like a particular opportunity. 

Similarly, tokenisation could provide small and medium-sized businesses, which are ordinarily subject to banks’ fluctuating appetite for risk, with an alternative source of growth capital. Meanwhile, other types of organisation that typically have limited access to capital markets – including social ventures and community projects – may see issuing tokens as a scalable alternative to grant applications or crowdfunding.

Tokenisation is perhaps most advanced in the creative sector. Before the advent of NFTs, there were few incentives for producing monetisable digital artwork, as files could easily be pirated. By providing more or less immutable records of ownership for digital files, NFTs provide incentives and make it technically possible for artists to receive automatic royalties on re-sales of their work. 

Combined with the ability to sell directly to the public without intermediation by commercial galleries, NFTs seem to be making it easier for creators to develop real businesses (although it is not yet clear whether current levels of demand are sustainable).
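
As a rough illustration of the automatic royalty mechanism, consider the Python sketch below. The 10% royalty rate and the settlement function are hypothetical; in practice, the split would be encoded in the NFT's smart contract and applied whenever the token changes hands.

```python
# Illustrative resale-royalty split. On-chain, this logic would live in
# the token's contract; the rate here is a hypothetical 10%.

ROYALTY_RATE = 0.10  # share of every resale paid to the original creator

def settle_resale(sale_price: float) -> dict:
    """Split a resale between the current seller and the original creator."""
    royalty = sale_price * ROYALTY_RATE
    return {"creator_royalty": royalty, "seller_proceeds": sale_price - royalty}

# The creator earns on every subsequent sale, with no new negotiation.
for price in (1_000, 5_000, 20_000):
    print(price, settle_resale(price))
```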

In general, if one subscribes to the view that greater supply of capital leads to productive investment, job creation and growth, the potential of tokenisation should be taken seriously. 

Figure 1: UK consumer interest in cryptocurrencies and NFTs during the pandemic period, as measured by internet searches

Source: Google Trends

Virtual economic growth

The idea of a metaverse economy might seem particularly far-fetched, but a substantial virtual economy already exists. Sales of virtual goods inside ‘massively multiplayer online games’ (or MMOs) are estimated – admittedly by gaming industry market intelligence firms – to have amounted to $40-93 billion globally in 2019, and to be growing at a rate of around 15%. 

Many games have native currencies that can be exchanged for skins (virtual goods such as clothes or armour, which alter a player’s in-game appearance). In the video game Elite Dangerous, for example, a currency called ARX can be bought or earned through game-playing and then used to purchase livery for the player’s spacecraft.

Advocates of web3 argue that the development of the wider virtual economy is held back by the absence of property rights. Currently, a dashboard ornament purchased in Elite Dangerous cannot be taken into World of Warcraft: it remains the property of the game’s developer, Frontier Developments, which could, if they wished, confiscate it from a player who had paid for it. 

Replacing native currencies with cryptocurrency and minting skins as NFTs, on the other hand, would provide stronger incentives for third-party developers to create new ranges of virtual goods, and for players to increase their spending on them. This seems at least plausible and much more like real economic activity than cryptocurrency speculation (though it should be noted that many in the gaming community are unconvinced that it would be technically feasible or additive to the game-playing experience). 

What is more certain is the contribution of the UK video gaming industry to the economy: around £1.8 billion towards GDP and around 40,000 jobs in 2018. Larger than any of its European counterparts, it is well-placed to benefit if web3 technologies do indeed drive gaming innovations.

Conclusion

Predicting the economic impact of emerging technologies is notoriously difficult. The biggest benefits from technological change often come from positive spillovers and the biggest risks from unforeseen externalities. Which of the stories about web3 sketched out in this piece will come true is anybody’s guess. 

But increasing amounts of Silicon Valley’s abundance of capital and software engineering talent are being poured into web3 projects. And as the Web 2.0 era has shown, these decisions about where to focus energy will have repercussions for the economy and beyond. 

Where can I find out more?

  • Policy Brief: Crypto, web3, and the metaverse: A simple explanation of cryptocurrencies, blockchain, NFTs and the metaverse, together with discussion of web3’s policy implications, by the Bennett Institute for Public Policy.
  • web3 policy handbook: US venture capital investors Andreessen Horowitz make a bullish case for web3 and suggest actions that governments should take to encourage its development.
  • Line Goes Up – The Problem With NFTs: an entertaining if somewhat polemical video essay that aims to debunk claims that web3 technologies can form the basis of a more equitable internet.
  • The Crypto Syllabus: comprehensive reading lists for studying web3 from social, economic and technological perspectives, with short introductory overviews. 

Who are experts on this question?

Author: Sam Gilbert
Picture by Antonio Solano

#RES2022: Why should economists care about biodiversity? https://www.coronavirusandtheeconomy.com/res2022-why-should-economists-care-about-biodiversity Thu, 14 Apr 2022

The way that economists and policy-makers think about the world is shifting. As the climate and ecological crises continue to unfold, it is becoming increasingly clear that our everyday actions have large and direct negative effects on our surroundings.

Carbon emissions from our flights, farms and factories affect the air we breathe. The forests we raze for fuel, agriculture and to build homes, offices and schools leave the earth’s atmosphere starved of oxygen. And as more and more fish are hauled from our seas, the water in which they once swam is becoming less hospitable to life.

Figure 1: Forest area by country (% change in square km, base year = 1990)

Source: World Bank

According to data from the World Bank, many countries have depleted a vast share of their forests since 1990. Figure 1 highlights how Brazil has cut down around 15% of its trees, Indonesia 24% and Paraguay over a third.

Similarly, data from the International Union for Conservation of Nature (IUCN) show that there are almost 3,000 species of fish now classified as critically endangered. Nearly 2,500 amphibians are also on the red list, facing extinction, as are around 2,000 insects (see Figure 2).

Figure 2: Endangered species (number of species on red list)

Source: IUCN Summary Statistics

But how did we get here? How has the economic development of the last century come at so great a cost? These questions, as well as how best to react to the emergency at hand, were the subject of a discussion between Diane Coyle (University of Cambridge and one of the Observatory’s lead editors) and Partha Dasgupta (University of Cambridge and author of the Dasgupta Review, an independent report commissioned by the UK Treasury).

The conversation, held on the first day of this year’s Royal Economic Society (RES) annual conference, centred on both why and how we must embed our reliance on nature into economics. Diane and Partha also explored how traditional economic measures (such as GDP growth) are insufficient to capture this relationship, and discussed what needs to change about economics as a discipline to address these collective problems.

What are the main messages of the Dasgupta Review?

At the core of the review is the idea of ‘embeddedness’. For Partha, it is no longer sufficient for economists to view nature as something separate from human activity. To understand and model our use of natural resources with precision, it is vital that economists see human beings – as well as our accumulation of physical and human capital – as contained within nature.

Measurement is also an important issue. Until economists and policy-makers are able to find accurate ways to track the effects of economic activity on the natural world, it will be very difficult to price anything accurately. How will we ever know the true costs or benefits of a given investment without having a clear picture of the way that such an investment alters or shapes the natural environment around it?

For example, if developers in Sri Lanka want to build a shrimp farm on the coast, any economic benefits in terms of jobs, trade and profits must be weighed against the destruction of the immediate area around the farm. To build the facility, mangroves will need to be removed and an area of open water contained. Not only will this take away a natural filter for the region’s seawater, but it will also reduce the availability of nursery space for native fish, birds and insects to breed and lay their eggs.

As a result, to gain a complete picture of the economic costs and benefits of such a project, it is important to understand the complex network of relationships between the sea, wildlife and plants. Analysing projects simply by looking at human factors, such as income and profit, overlooks numerous other issues, giving a murky and incomplete picture of potential economic or social gains.

How should economists change the way they think?

For both Diane and Partha, embedding nature into economics is not about tearing up the rulebook. Correctly accounting for the value of the natural world does not mean doing away with economic orthodoxy.

As Partha put it during the conversation, as economists ‘we have inherited a language created by some of the most extraordinary minds of the past century… but we are misusing it.’ Embedding nature into economics is about a methodological re-orientation, not a revolution.

In fact, traditional economic thinking can help to clarify our understanding of the climate and ecological crises, if well deployed. A standard scarcity framework, for example, shows that as natural capital is depleted, its relative price increases. This means that the true cost of any activities that further reduce the stock of natural capital is set to grow continually. By the same logic, this implies a ‘negative discount rate’: our forests, seas and marshlands become more and more valuable to future generations.
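
One way to see the logic, under stylised assumptions that are not taken from the Review itself: suppose the relative accounting price of natural capital grows at rate g, while benefits are discounted at the consumption rate δ. The present value of a unit of natural capital delivered at time t is then:

```latex
% Present value of one unit of natural capital at time t, with relative
% price growth g and consumption discount rate \delta:
PV(t) = p_0 \, e^{g t} \cdot e^{-\delta t} = p_0 \, e^{-(\delta - g)\, t}
```

The effective discount rate on natural capital is δ − g, which is negative whenever the relative price of nature rises faster than the rate at which consumption is discounted (g > δ) – precisely the sense in which natural assets become ever more valuable to future generations.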

So, rather than viewing economics as a discipline at odds with protecting the environment, its tools should be turned to new tasks. The valuable lessons of asset management, discounting and resource optimisation – when combined with granular data informed by ecology and biology – can allow policy-makers to take a holistic view that captures the economic, social and environmental costs and benefits of different interventions. To view these decisions with maximum clarity requires using the methods already at our disposal, not rejecting the knowledge accrued over the last few decades entirely.

What next?

Towards the end of the discussion, Diane questioned the level at which governments will need to intervene to embed nature into economics. How much should be left to firms and households? Can individuals be trusted to do the right thing and protect the natural world? In a situation with missing markets and externalities (such as pollution), normally it is thought to be the role of the state to ‘correct’ the problem with policy.

To this, Partha concluded by claiming there is no ‘one-size-fits-all’ framework to follow. Different groups have different ways of managing their local environments, and we cannot assume that the ecological and climate crises will play out in a uniform manner.

Where appropriate, decision-making should be ceded to local communities who understand the challenges they face. Community is as important as technology, governance and monetary incentives when it comes to creating sustainable systems that cause minimum harm.

Social capital, therefore, is the final piece of the puzzle. Networks of people – from the local level all the way up to multinational organisations – are vital if nature is to be embedded successfully into economics.

These community bonds stretch to future generations too. As Partha reflects in the review, ‘if we care about our common future and the common future of our descendants, we should all in part be naturalists’. To care for each other, therefore, is to care for the natural world – there can be no economy without it.

Where can I find out more?

Who are experts on this question?

  • Matthew Agarwala
  • Diane Coyle
  • Partha Dasgupta
  • Cameron Hepburn
  • Cristina Peñasco
  • Dimitri Zenghelis
Author: Charlie Meyrick
Picture by Damocean on iStock

Will our energy use change after the pandemic and COP26? https://www.coronavirusandtheeconomy.com/will-our-energy-use-change-after-the-pandemic-and-cop26 Fri, 04 Mar 2022

Global energy demand dropped by 4% in 2020 – the largest fall since the Second World War (International Energy Agency, IEA, 2021). The world also experienced the first recession to register negative growth rates in all fossil fuels – with global oil demand falling by almost 10% – as well as the fastest increase in renewable energy sources as a share of electricity generation. These changes to global energy markets were a side effect of the pandemic and lockdowns.

Figure 1: Evolution of global GDP, total primary energy demand and energy related CO2 emissions, relative to 2019

Source: IEA Global Energy Review 2021

In comparison, before the pandemic, despite the 2015 Paris climate agreement setting the goal of limiting global warming to well below 2°C above pre-industrial levels, greenhouse gas emissions reached a record high in 2019. This put the world on track for an average temperature rise of over 3°C this century (United Nations, 2020).

Ahead of the 26th United Nations (UN) Climate Change Conference (COP26) in November 2021, the appetite to take definitive action against climate change grew. This is especially true given the opportunity that the pandemic afforded citizens and governments to re-evaluate their objectives. As highlighted by the recent report from the UN Intergovernmental Panel on Climate Change and Figure 1 above, the period of recovery following the pandemic represents a significant opportunity for the world to move policy more decidedly in the direction of meeting the goals set in global climate agreements.

What happened to the energy mix in earlier recessions – 1975, 1982, 1991 and 2009?

The energy mix shows how the world meets its energy needs using various energy sources. This article focuses on commercially traded fossil and non-fossil fuels, as published in BP’s Statistical Review of World Energy 2021, grouping together fossil fuels (oil, natural gas, coal) and non-fossil fuels (nuclear, hydro, and modern renewables such as solar and wind).
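
As a sketch of this grouping, the snippet below (Python) computes fossil and non-fossil shares from a small table of primary energy consumption in exajoules. The magnitudes are rough 2020-scale illustrations and the column names are stand-ins, not the actual layout of the BP spreadsheet.

```python
# Group fuels into fossil and non-fossil and compute their shares of the
# energy mix. Figures are illustrative, roughly on a 2020 world scale (EJ).
import pandas as pd

FOSSIL = ["oil", "natural_gas", "coal"]
NON_FOSSIL = ["nuclear", "hydro", "renewables"]

df = pd.DataFrame(
    {"oil": [173.7], "natural_gas": [137.6], "coal": [151.4],
     "nuclear": [24.0], "hydro": [38.2], "renewables": [31.7]},
    index=[2020],
)

df["fossil"] = df[FOSSIL].sum(axis=1)
df["non_fossil"] = df[NON_FOSSIL].sum(axis=1)
df["fossil_share"] = df["fossil"] / (df["fossil"] + df["non_fossil"])
print(df[["fossil", "non_fossil", "fossil_share"]].round(2))
# fossil ~462.7 EJ, non-fossil ~93.9 EJ, fossil share ~0.83
```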

The recessions of 1975, 1982 and 1991 all occurred immediately after a rise in the price of oil. The 1975 recession followed the 1973 oil price crisis resulting from oil embargoes by the Organization of the Petroleum Exporting Countries (OPEC). The 1982 recession came after the oil price rise in 1979, linked to the Iranian revolution and later the Iran-Iraq war. The 1991 recession followed the 1990 oil price shock, which was linked to the Iraqi invasion of Kuwait. These recessions have shaped the energy market, as the costs were not always borne equally by richer (OECD) and poorer (non-OECD) countries.

For example, OECD countries were affected most by the 1975 recession. In the aftermath of the crisis, world GDP grew at 1.5% and the effect on the global energy mix was short-lived. Figure 2 plots each fuel in the energy mix (stacked on top of each other) in exajoules (EJ, a unit of energy commonly used by industry experts). As shown, oil’s share in primary energy consumption dipped during the crisis but began to recover towards the end of the 1970s.

Figure 2: Primary energy consumption by fuel in exajoules (EJ)

Source: BP Statistical Review of World Energy, 2021

The 1982 recession affected the developed and developing world equally. It led to a slowdown in annual output growth to around 0.1% across the two groups of countries. During this recession, world primary energy consumption recorded negative growth (see Table 1), with oil consumption declining by over 3%, while non-fossil fuels experienced rapid growth.

In this sense, the 1982 recession solidified the decline in the share of oil that was precipitated by the 1973 crisis, particularly in richer OECD countries. To some extent, it triggered the beginning of the slow decline in the market share of oil consumption. That trend continues to this day: in the 47 years since 1974, oil’s share of primary energy consumption in OECD countries has declined in all but seven years; since 2000, it has increased just once.

Table 1: Percentage change over previous year by fuel group in world recessions

Source: Authors’ calculations based on BP Statistical Review of World Energy, 2021

The 1991 recession resulted in a slowdown in consumption of all fuels, but only coal saw negative growth rates in the world as a whole (see Table 1). Table 2 indicates that this recession had greater effects on non-OECD countries, which saw growth rates in all fuels fall further below their long-run averages than did their OECD counterparts.

Table 2: Percentage change over previous year by fuel group in world recessions, OECD and non-OECD

Source: Authors’ calculations based on BP Statistical Review of World Energy, 2021
Note: P: primary, R: renewables, H: hydro, N: nuclear, C: coal, G: gas, O: oil

The 2009 recession, resulting from the global financial crisis of 2007-09, differs from its predecessors for one fundamental reason: it did not follow a supply-induced oil price hike and was an aggregate demand shock. It followed the 2008 sub-prime mortgage crisis in the United States along with ensuing spikes in unemployment and a consumer credit squeeze.

This recession affected almost all fuels, as demonstrated by the negative growth rates in Table 1, which is also visible in Figure 2 as a temporary dip. Only renewables continued to grow and, encouragingly, they grew above their long-term average despite the deep recession experienced around the world.

This was one of the few recessions in which global energy consumption fell (see Table 1). But it affected the world unequally: although primary energy consumption declined in OECD countries, it still recorded positive – albeit reduced – growth in non-OECD countries (see Table 2). This implies that the fall in OECD energy consumption was pronounced enough for energy consumption to decline globally.

How is the 2020 recession different?

The Covid-19 recession brought the largest decline in primary energy consumption since the Second World War: a fall of over 4%. This represents more than two and a half times the decline seen in the 2009 recession and over six times the decline of the 1991 recession.

Falls attributed to the pandemic are concentrated in fossil fuel consumption, with unprecedented global declines of 3%, 4% and over 9% in coal, natural gas, and oil consumption, respectively (Figure 2 and Table 1).

This is the first economic shock since the Second World War to affect primary energy consumption negatively in non-OECD countries (see Table 2). It is also the first recession in which all fossil fuels recorded negative growth rates in non-OECD countries.

Perhaps unsurprisingly, jet fuel and gasoline demand plummeted as planes were grounded and lockdowns took effect around the world. It is also notable that renewables enjoyed growth rates near their long-term averages in both OECD and non-OECD countries. This represents a degree of convergence towards Paris Agreement climate targets not seen previously.

After Covid-19, where do we stand vis-à-vis climate objectives?

The fall in energy demand was also reflected in carbon emissions. Global carbon emissions from energy use fell by 6.3% in 2020. This is roughly in line with the UN Emissions Gap Report 2020, which predicted a decline in total carbon emissions in 2020 of 7% due to the pandemic. Note here that we differentiate between carbon emissions from energy use and total emissions. This is because although energy use accounts for the majority of carbon emissions, other industries – most notably agriculture – also play a role.

While this fall in carbon emissions is vast by historical standards, the UN report also noted that the world will need to reduce total greenhouse gas emissions by 7.6% a year for the next ten years in order to meet the Paris Agreement’s goals on climate change (UN, 2020).

This puts into context the scale of change needed: it took a severe global recession, leading to unparalleled reductions in energy and non-energy consumption, to reduce carbon emissions sufficiently. And it came at a heavy cost, with the world economy contracting by 3.4% in 2020.

To place this in relation to the 2°C target, Figure 3 plots global carbon emissions from energy use through to 2030. It uses the 2019-20 growth rate to demonstrate where we would be if we lived as we did in 2020 for the rest of the decade. Using the IEA’s World Energy Outlook as a basis, we see that by 2030, carbon emissions from energy use would be well below the level required to stay within the 2°C target.

Figure 3: Global carbon emissions from energy use

Source: Authors’ calculations, BP Statistical Review of World Energy 2021, and IEA World Energy Outlook 2021
Note: Data from 2021 onwards uses the 2019-20 growth rate to interpolate the path to 2030. The IEA’s World Energy Outlook 2020 Sustainable Development Scenario is used as a basis for comparison in 2030. This scenario is largely in line with the 2°C target.
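
A minimal sketch (Python) of the extrapolation behind Figure 3: apply the 2019-20 growth rate to every year out to 2030. The starting level and growth rate used here are approximate, illustrative values rather than the authors' exact inputs.

```python
# Project energy-related CO2 emissions to 2030 by repeating the 2019-20
# growth rate. Both numbers below are approximations for illustration.

base_year, base_emissions = 2020, 32.0   # GtCO2 from energy use (approx.)
growth = -0.063                          # roughly the 2019-20 change

path = {base_year: base_emissions}
for year in range(base_year + 1, 2031):
    path[year] = path[year - 1] * (1 + growth)

for year in (2020, 2025, 2030):
    print(year, round(path[year], 1))
# Sustaining the 2020 fall would cut emissions by nearly half by 2030 -
# which is exactly why the exercise is 'admittedly unrealistic'.
```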

This is quite a positive message, and the desired level of emissions seems within reach by 2030. The missing piece of the puzzle is the composition of emissions. Composition matters because different fuels have different carbon contents. This has a direct impact on the amount of carbon emitted when each of these fuels is combusted. For example, some types of coal emit twice as much carbon as natural gas per unit of energy produced.

As such, it is, in principle, possible to reduce emissions without curbing total energy consumption. This could be achieved by replacing coal with natural gas or, better yet, non-fossil fuels. Within fossil fuels, coal is the most carbon-intensive and natural gas the least. Oil ranks in the middle, and various products derived from crude oil have their own carbon intensities.
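
A back-of-the-envelope sketch (Python) makes the point. Using approximate default combustion factors (tonnes of CO2 per terajoule of energy; exact values vary by coal and oil type), switching coal to natural gas cuts emissions substantially while delivering the same total energy.

```python
# Why fuel composition matters: total emissions are energy use times each
# fuel's carbon intensity. Intensities are approximate default factors.

INTENSITY_TCO2_PER_TJ = {"coal": 94.6, "oil": 73.3, "natural_gas": 56.1}

def emissions(energy_tj: dict) -> float:
    """Total combustion CO2 (tonnes), given energy use in TJ by fuel."""
    return sum(tj * INTENSITY_TCO2_PER_TJ[f] for f, tj in energy_tj.items())

before = {"coal": 1_000, "oil": 1_000, "natural_gas": 1_000}
after = {"coal": 0, "oil": 1_000, "natural_gas": 2_000}  # same total energy

print(emissions(before))  # 224,000 tCO2
print(emissions(after))   # 185,500 tCO2: ~17% lower, no energy forgone
```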

Continuing the hypothetical exercise above, in which we sustain 2020 trends until 2030, Figure 4 shows how the energy mix would evolve over the years. Although arguably unrealistic because this exercise assumes a continuous decline in global primary energy consumption, there are some useful messages here:

  • First, we need non-fossil fuels to continue to grow at a rapid pace.
  • Second, replacing coal with cleaner fossil fuels or non-fossil fuels would provide the energy we require without the emissions.

Figure 4: Global energy mix by fuel, 2021 onwards estimated using 2019-20 growth rates

Source: Authors’ calculations based on BP Statistical Review of World Energy 2021

What does the post-Covid-19 recovery look like?

Carbon savings realised in 2020 came mostly from the fall in oil consumption, as lockdowns, home working and less air travel meant lower carbon emissions. These changes are particularly notable in a year with such low oil prices. In 2020, the oil price was significantly lower than its five-year historical average (see ‘WTI’ compared with ‘5yr WTI’ in Table 1).

Moving forward, the transport sector is key. To highlight this point, Figure 5 shows the sectoral contribution to primary energy consumption growth in 2020, calculated using the IEA’s sectoral shares. Although transport accounted for approximately 33% of primary energy consumption during 2020, it contributed over half of the fall seen in primary energy consumption. This reaffirms the importance of this sector in reaching climate objectives and therefore in appropriate policy responses.

Figure 5: Sectoral contribution to primary energy consumption growth in 2020

Source: Authors’ calculations based on BP Statistical Review of World Energy 2021 and IEA World Energy Outlook 2020

Note: The figure represents the primary energy growth in 2020 of -4.38%. In other words, the whole ‘pie’ corresponds to the -4.38% change in primary energy growth. Displayed figures are percentage contribution of each sector towards this.
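
The decomposition behind Figure 5 can be sketched as follows (Python): each sector's contribution to total primary energy growth is its consumption share multiplied by its own growth rate. The shares and growth rates below are hypothetical round numbers chosen only to mimic the pattern described, not the authors' figures.

```python
# Decompose total primary energy growth into sectoral contributions:
# contribution_i = share_i * growth_i. All inputs here are hypothetical.

shares = {"transport": 0.33, "industry": 0.37, "buildings": 0.30}
growth = {"transport": -0.070, "industry": -0.020, "buildings": -0.012}

contributions = {s: shares[s] * growth[s] for s in shares}
total = sum(contributions.values())

for sector, c in contributions.items():
    print(f"{sector:10s} {c:+.4f}  ({c / total:.0%} of the total change)")
print(f"{'total':10s} {total:+.4f}")
# With these inputs, transport alone accounts for roughly two-thirds of
# the overall fall, despite being only a third of consumption.
```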

The rapid recovery of oil prices from the (briefly negative) depths of 2020 indicates that oil demand may be returning to pre-pandemic levels. While rising prices are bad news for consumers in the short run, a high oil price can incentivise investments in renewables in the longer term.

But this is a complex relationship. Links between oil price and the energy mix have been weakening since the 2009 recession and the shale revolution in the United States. Due to the differences in technology required for hydraulic fracturing (fracking), producers have been able to adjust production rapidly in response to the oil price (Ersoy and Schaffer, 2020).

This has placed a ceiling on the price of oil and, to a certain extent, reduced the ability of OPEC to influence it. Since a low oil price, and thus cheap energy, tends to slow down the adoption of non-fossil alternatives, this price ceiling could limit the growth rate of non-fossil energy.

Nevertheless, renewable energy consumption carried on growing apace in 2020. This is at least partially because investment in renewables is now driven more by national policies and long-term commitments, including the Paris Agreement, along with a growing appetite to switch away from fossil fuels.

Cost reductions have also played an important part in the growth of renewables. Hydropower, solar and wind are already cost-competitive against fossil fuel power generation in several countries. This continues to encourage governments, firms and households to invest in them. The trend is likely to continue with technological progress. Since efficiency gains in fossil fuel technology have already been largely exploited, the scales are likely to tip in favour of non-fossil fuels in the coming decades. Similar patterns had been visible in the data before now, and the all-important question has always been whether the switch would take place sufficiently quickly.

What should we expect after COP26?

While the most recent Intergovernmental Panel on Climate Change (IPCC) report (IPCC, 2021) paints a dire view of the situation, there were significant opportunities for progress at COP26. Economics establishes that we suffer from ‘ambiguity aversion’: we are not very good at taking action when risks are uncertain. But the IPCC has now made its most concrete statement that humans are ‘unequivocally’ causing climate change.

This is a statement that is borne out of multiple lines of evidence – from lived experience as well as from climate change models. In 2021 alone, we saw unprecedented extreme weather events: flooding in Germany, and wildfires in Greece, Turkey and the United States. As the IPCC report notes, the costs of climate change are being witnessed all over the world. Climate change is no longer a poor-country issue; rich countries are also seeing the costs first-hand.

Such an even distribution of costs augured well for COP26, since when costs are distributed evenly, international cooperation is more likely. This was pivotal in the resolution of a previous global crisis that required international cooperation: the hole in the ozone layer, which was created by emissions of chlorofluorocarbons (Chick, 2019).

A drastic change to the global energy mix is required to achieve climate goals, including countries committing voluntarily to strand fossil fuel assets, such as discovered oil reservoirs, oil rigs and related fossil fuel infrastructure (McGlade and Ekins, 2014, 2015). This puts an additional burden on non-OECD countries, as their economic development would benefit from extracting these resources.

With the correct set of policies, which recognise the inequality inherent between OECD and non-OECD countries in abatement costs, the world may make significant progress towards a more sustainable path for growth.

As far as carbon emissions are concerned, the effects of the pandemic are as close to time travel as we can get. Carbon emissions in 2020 were back to their 2011 levels. If we were back in 2011 today, would we have done anything differently? The world expected a satisfactory answer to this question from COP26. After all, we are wiser today than we were back in 2011.

Yet the commitments in Glasgow were lacklustre. The pledges may bring us closer to climate goals, but there is much more to be done – especially as the world’s focus is now on post-pandemic economic recovery. As highlighted in Figure 1, early forecasts already show global energy demand bouncing back in 2021 and surpassing pre-Covid-19 levels (IEA, 2021). If realised, this would bring carbon emissions back up and we may well find ourselves back where we started.

Where can I find out more?

Who are experts on this question?

  • Erkal Ersoy
  • Rachel Forshaw
  • Mark Schaffer
  • Dieter Helm
  • David Newbery
Authors: Erkal Ersoy and Rachel Forshaw
Photo by BerndBrueggemann on iStock

Firms’ digital innovation in the pandemic: how have workers been affected? https://www.coronavirusandtheeconomy.com/firms-digital-innovation-in-the-pandemic-how-have-workers-been-affected Thu, 03 Feb 2022

The pandemic has had a major impact on businesses and their employees. Social distancing requirements, shifting patterns of demand and continued uncertainty have brought changes in working practices and the introduction of new processes, products and services, many of which are likely to stick. Key examples include increased working from home and online sales. The adoption of new digital technologies has been central to these changes.

In the context of the broader debate about the effects of technological change on employment and jobs (see, for example, Acemoglu and Restrepo, 2019) a key question relates to how Covid-19-induced technology adoption has and will affect workers. 

Within firms, the introduction of new digital technologies or automated processes might have labour-saving effects – meaning that fewer workers are needed – where some routine tasks can be performed more efficiently by new technologies. The risks of ‘forced automation’ on low-wage workers were highlighted early on in the pandemic (Autor and Reynolds, 2020) and have been studied empirically in the context of the United States (Ding and Molina, 2020). 

But at the same time, new technologies alter the nature of work, create new tasks and change the demand for skills, and can complement (certain types of) labour. For instance, the adoption of marketing automation technologies powered by artificial intelligence (AI) makes it possible for businesses to leverage data at scale and raises the demand for skills associated with data analytics. 

A survey of recent studies on the effects of automation on labour demand finds more empirical support for a positive impact on employment overall at the firm level (Aghion et al, 2022). Automating firms can become more productive and grow, generating new jobs (potentially at the expense of their competitors, though relationships at the industry level are also positive).

The Centre for Economic Performance (CEP) and the Confederation of British Industry (CBI) have conducted two bespoke business surveys, the first in July 2020 and the second a year later, to generate timely data on the extent, nature and effects of technology adoption in response to the crisis. 

The second survey included a series of questions that sought to shed light on how technology adoption since the onset of the pandemic has affected the workforce. The rise of working from home and the associated increased flexibility have clearly been enabled by the widespread adoption of remote working technologies such as Zoom or Microsoft Teams. But this work set out to explore how effects on workers vary according to the types of technology introduced, or the characteristics of firms doing the adopting.

How has Covid-19 affected technology adoption in businesses?

As set out in a previous Economics Observatory article, the effects of a crisis on technology adoption are theoretically ambiguous (Valero and Van Reenen, 2021). But the evidence to date suggests that the pandemic has accelerated technology adoption in firms, in part due to the nature of the crisis (requirements for social distancing) and the readiness of digital technologies that allowed firms and workers to adapt quickly in sectors where this is feasible. 

Three months into the crisis, the first CEP-CBI survey found that over 60% of businesses had adopted new digital technologies (Riom and Valero, 2020). This compared with 13% engaging in ‘process innovation’ – defined as making ‘significant changes in the way that goods or services are produced or provided’ – over the three years to December 2018 (UK Innovation Survey). Other studies at that time also pointed to increased digitisation (Be the Business, 2020; ERC, 2020; CBI, 2020).

The second CEP-CBI survey, covering an additional 12 months, reaffirmed the strong innovation response among UK businesses. Three-quarters of firms adopted digital technologies over this timeframe (see Figure 1). Over half (55%) had adopted new digital capabilities and nearly 70% had adopted new management practices. In addition, over 60% of firms had introduced new products and services. 

Figure 1: Overall innovation response to Covid-19 (March 2020-July 2021)

Source: CEP-CBI survey, 2021. Notes: N=425, N=393, N=388, N=376 responded to each question, respectively

In terms of timing, while the adoption of digital technologies and management practices occurred early on (March-June 2020) at many firms, a large share of firms continued to innovate beyond the initial lockdowns. In contrast, the share of firms adopting new digital capabilities was constant, while product innovation increased over time.

But were these activities induced by the pandemic? And are new processes and products here to stay beyond it?

Most firms considered that the pandemic had accelerated their innovation plans, and between 11% and 34% of firms (across innovation types) reported that the pandemic actually prompted them to innovate. Having made these changes, most firms expected them to outlast the pandemic. Further, the reported effects on business performance are broadly positive, in particular with respect to resilience. 

These findings suggest that Covid-19 is building the innovative capacity of businesses, which is a cause for optimism with regard to the UK’s productivity puzzle. It is worth noting that the respondents in the CEP-CBI survey were larger than the typical UK firm. Indeed, technology adoption was more likely in larger, more digitally advanced firms. These firms were also more likely to report increased business resilience as a result of adoption – evidence that could point to a widening digital divide in the future.

This type of uneven technology response has been found in other surveys, including those focused on remote working in OECD countries, and digital technologies internationally. In the UK, the Office for National Statistics Business Impact of Covid-19 survey (ONS BICS) conducted in August 2021 (wave 38) included similar questions on technology adoption and found that on a larger and more representative sample, 28% of businesses had adopted new technologies (the share is higher among businesses with ten or more employees). 

While this is substantially smaller than the 75% in Figure 1, it still represents a rise compared with pre-pandemic ‘process innovators’ discussed above. In addition, 46% of the BICS sample answered that they were ‘not sure’ when asked about technology adoption – versus a negligible share in the CEP-CBI survey where the respondents were perhaps more used to being asked detailed questions on technology-related issues.  

What technologies have businesses been adopting?

While remote working technologies are perhaps the most commonly discussed among adopters in the CEP-CBI survey, new technologies also related to a range of business functions, from logistics to security. The most cited were sales and marketing, followed by people management and remote working, with around 70% and 65% of adopters, respectively, selecting these (see Figure 2). 

Figure 2: Business functions that newly adopted digital technologies relate to

Source: CEP-CBI survey, 2021. Notes: N=307 (Adopters that answered this question)

Technologies for remote working – such as video conferencing or collaboration technologies – were the most adopted specific technology types, either alone or in ‘bundles’ with other technologies including online marketing tools (for example, a new website or social media platforms/e-commerce), cloud, data analytics and cybersecurity.

How have newly adopted digital technologies affected work?

In terms of overall impact on the size of the workforce, most firms (63%) reported that newly adopted digital technologies had no impact (see Figure 3). The share saying that they had reduced the need for workers (16%) was similar to the share reporting an increased need (13%). Given that the furlough scheme was in operation during this period – and a high share of survey respondents, over 70%, had accessed it – it was potentially too early to detect changes with respect to employment resulting from new digital processes.

There was more evidence of changes in working practices. An increase in flexible working stands out, with 75% reporting an increase. But training, productivity and worker satisfaction were also reported to have risen in over 40% of firms that adopted new technologies.

Almost half (45%) stated that they had reorganised staff or reallocated employees to new tasks as a result of new technology adoption. And while most firms reported no change to working hours, 18% reported a rise due to the implementation of new technologies.

Figure 3: Effects of technology adoption on workforce

Source: CEP-CBI survey, 2021. Notes: N=319 (adopters that answered this question)

There were also associated changes in hiring activity. Over a quarter reported an increase in hiring in specialist skills, and a similar share stated that they were hiring from a broader geography than before the pandemic (see Figure 4). This is a phenomenon enabled by increased remote working, with implications for the future shape of cities (Nathan, 2021).

Remote working was still prevalent in July 2021, and at that point – before Omicron emerged and guidance changed – expectations for working patterns in January 2022 looked very different from pre-pandemic norms, with one to two days at home being the most popular option among firms.

Figure 4: Effects on skill needs and hiring

Source: CEP-CBI survey. Notes: N=319 (Adopters that answered this question)

How have effects varied by firm and technology type? 

Firms that had adopted new digital technologies prior to the pandemic were more likely to report improved worker productivity because of new technology adoption, controlling for other key business characteristics, such as firm size, age, sector, human capital and location. But they were less likely to report an increase in flexible working. Perhaps such firms, being more technologically advanced, had fewer ‘teething’ problems with new technologies and already had flexible working practices in place. 

Importantly, firms with a higher share of degree-educated employees were more likely to report an increase in overall workforce size as a result of technology adoption, consistent with new digital technologies and skills being complementary.
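
Statements like these typically rest on a regression of the reported outcome on adoption and firm-level controls. Below is a sketch of that kind of model using Python and statsmodels; the variable names are hypothetical and the data are simulated, not the survey's.

```python
# Logit of a reported outcome (e.g. 'workforce grew') on prior adoption,
# controlling for firm characteristics. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "prior_adopter": rng.integers(0, 2, n),   # adopted tech pre-pandemic?
    "log_size": rng.normal(4, 1, n),          # log employment
    "degree_share": rng.uniform(0, 1, n),     # share of graduate employees
})
# Simulate an outcome in which adoption and skills genuinely matter.
latent = -1 + 0.8 * df.prior_adopter + 0.2 * df.log_size + 0.5 * df.degree_share
df["workforce_up"] = (latent + rng.logistic(size=n) > 0).astype(int)

model = smf.logit(
    "workforce_up ~ prior_adopter + log_size + degree_share", data=df
).fit(disp=False)
print(model.params)  # degree_share enters positively, echoing the survey
```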

There were also different effects according to the type of technology being adopted. Where firms adopted technologies relevant for people management, remote work or other ‘business-as-usual’ tasks, they were more likely to report increased flexible working. 

Firms that adopted remote working technologies also saw a rise in average working hours. This is consistent with other studies that have shown that home workers have put in more hours since Covid-19 (see, for example, Barrero et al, 2021). Firms that adopted new technologies relevant for research and development (R&D) functions were more likely to report an increase in workforce size.

What are the implications for skills policies?

Covid-19 has increased technology adoption and other types of innovation in businesses, and these activities are affecting the workforce. Firms considered that skills constraints were the biggest barrier to technology adoption in the second CEP-CBI survey, perhaps reflecting more widespread shortages at that time. 

This highlights the importance of skills and training policies for innovation in businesses, as well as building workers’ resilience to changes induced by Covid-19 and broader transitions during this coming decade, including the move towards net zero and changes due to Brexit. This is particularly relevant against a background of skills gaps and declining on-the-job training in firms (Li et al, 2020).

Continued digitisation in businesses will occur at the same time that the move towards net-zero emissions brings changes to strategies, operations and jobs. Indeed, around two-thirds of firms in the CEP-CBI survey take environmental considerations into account to at least some extent.

This is happening across a number of areas, including decisions on technology adoption, working practices and office design or location. It presents an opportunity for a joined-up approach to business support and skills policies to boost productivity growth in an inclusive and sustainable way.

What else do we need to know?

The new CEP-CBI survey data provide a first look at the potential effects of pandemic-induced technology adoption on the workforce. There are well-known caveats in the analysis of self-reported performance measures. In particular, we might expect biases that result in firms over-reporting socially desirable behaviours (such as positive effects of adoption on company or worker performance) and under-reporting socially undesirable behaviours (such as a reduction in headcount). But respondents’ knowledge that their answers were anonymous should mitigate these effects. Merging business surveys with administrative data will allow researchers to track the actual performance of firms, and trace the effects of technology adoption on employment and other worker outcomes over time.

Where can I find out more?

Who are experts on this question?

  • Anna Valero
  • Capucine Riom
  • Juliana Oliveira-Cunha
  • John Van Reenen
  • Nick Bloom
  • Chiara Criscuolo
  • Diane Coyle
  • David Autor
Authors: Anna Valero, Capucine Riom, Juliana Oliveira-Cunha
Photo by shironosov from iStock

Should central banks develop their own digital currencies? https://www.coronavirusandtheeconomy.com/should-central-banks-develop-their-own-digital-currencies Tue, 11 Jan 2022

The use of cash is declining across the world, falling by 35% between 2019 and 2020, according to a recent UK Finance report. In the UK, cash accounted for just a fifth (17%) of all payments in 2020, down from more than a half (56%) a decade earlier.

With over a quarter of all payments in the UK made via contactless methods, consumers are looking for convenient ways to spend their money in a digital world. The banking sector as a whole is starting to increase its digitalisation with the emergence of digital banks such as Monzo, Revolut and Starling in the UK, and the growth of vendors such as Alibaba’s Ant Financial and Tencent’s WeBank in China’s financial sector.

One of the biggest changes coming to the banking sector will be the emergence of central bank digital currencies (CBDCs), which are digital currencies created and controlled by central banks such as the Bank of England in the UK, the European Central Bank (ECB) in the Eurozone and the Federal Reserve (the Fed) in the United States.

Comparisons are often made with cryptocurrencies since some proposed CBDCs could make use of the ‘blockchain’ technology that is used in many popular cryptocurrencies. But CBDCs will be controlled by central banks via their own private blockchains to ensure privacy and avoid the many security and volatility issues faced by cryptocurrencies. As a result, CBDCs will be quite distinct from cryptocurrencies such as Bitcoin and Ethereum.

In this article, we explore what CBDCs are, why governments are trying to create them, how far advanced they are, and the risks associated with these new currencies.

What are CBDCs?

CBDCs are forms of regulated, government-issued electronic money. While most cryptocurrencies, like Bitcoin, are decentralised assets and a pure ‘peer-to-peer’ version of electronic money (Quinn, 2021), CBDCs will be governed by central banks such as the Bank of England, the ECB and the Fed.

CBDCs are being developed as digital forms of national currencies that could ultimately replace cash, supporting the move to a cashless society. Indeed, 86% of central banks are actively researching CBDCs, 60% are experimenting with the technology and 14% are deploying pilot projects, according to a recent Bank for International Settlements (BIS) survey.

The Bahamas became the first nation to introduce CBDCs with the ‘sand dollar’ in October 2020, while Nigeria became the first African country to launch a digital currency – the eNaira – in October 2021. In China, the digital renminbi (e-CNY) is being developed for cross-border use, while in the United States, two CBDC initiatives are under way. In September 2021, Fed chair Jerome Powell said that the central bank is ‘working proactively to evaluate whether to issue a CBDC… I think it’s more important to do this right than to do it fast’.

Why are governments trying to create CBDCs?

There are several reasons why governments might introduce CBDCs. Here, we discuss some of the most important motivations.

First, there is the threat posed by cryptocurrencies and ‘stablecoins’ like Tether. Almost every central bank has written white papers on cryptocurrencies. While many cryptocurrencies could never replace a national currency for practical reasons – such as large transaction fees, limited transaction throughput and high price volatility (see Quinn, 2021, for an excellent discussion) – central banks are concerned that they are being left behind in this digital revolution and want to move with the times. The growing interest in and use of cryptocurrencies are a challenge to national currencies, and issuing CBDCs will help counteract that growth.

Second, CBDCs should improve the efficiency and safety of both retail and large-value payment systems. On the retail side, the focus is on how a digital currency can improve the efficiency of making payments, for example, by speeding up transactions at the point of sale, online and peer-to-peer. There could also be benefits of having a CBDC for wholesale and interbank payments since, for example, it could facilitate faster settlement and extended settlement hours. Further, CBDCs could improve the efficiency of cross-border interbank payments and settlements by offering 24-hour availability and anonymity, and by eliminating counterparty credit risk for participants.

Third, the introduction of CBDCs would speed up the transition to a cashless society. Cash use is falling at a dramatic rate due to the ease of payments using cards, apps and contactless technology. Cash costs money to produce – for example, a $100 note costs 14 cents to print, just 0.14% of its face value but a cost that adds up across the billions of notes that must be printed, distributed and replaced – so a cashless society reduces costs for central banks. Cash is also difficult to trace, which makes it attractive for tax evasion, money laundering and illegal transactions. And it poses a greater security risk when funds are transported and payments made, as there is no record of exchange. Future governments may therefore wish to remove cash to reduce crime and improve tax receipts.

Fourth, CBDCs may improve financial inclusion. More than 1.7 billion adults around the globe (and 4% of the UK population) are 'unbanked' – that is, they do not have 'access to the services of a bank or similar financial organisation'. CBDCs could promote financial inclusion by giving these unbanked populations access to a safe place for their savings and, eventually, access to credit.

What are the risks of CBDCs?

One concern about CBDCs is that they would require centralisation of the banking sector, which would amplify the threat of cyber-attacks. Just as the failure of any one bank erodes confidence in banking, a successful attack on a CBDC would relocate that risk to the central bank itself. Concentrating payments in this way would also negate the benefits of risk-sharing structures and of the distance between participants in the financial system.

That said, blockchain technology is very secure and transactions are highly compartmentalised, which means that a central bank could potentially operate a distributed system, spreading the risk and consequences of any cyber-security breach more widely.

A CBDC could also encroach on consumer privacy and protections. With a CBDC, the central bank could easily block individuals or groups who fall out of favour with the government from using their funds. How money – a public good to which equal access is a human right – is saved, sent, spent and secured should be as free as possible, while still penalising bad actors. US Congressman Tom Emmer wrote: 'Central banks increase control over money issuance and gain insight into how people spend their money, but deprive users of their privacy', adding that 'CBDCs would only be beneficial if they are open, permissionless and private.'

There is a concern that financial exclusion has worsened during the pandemic, as efforts to digitise money have been supercharged. The introduction of CBDCs could exacerbate this if the new currencies are beyond the reach of those with older devices or without access to digital wallets. Care will be needed to avoid further disenfranchising the old, the poor and the vulnerable.

The future

It is inevitable that central banks will issue CBDCs in the future given the dramatic move to online banking and the speed of digitalisation. The design of these CBDCs may differ substantially across nations, but in all cases, the central bank will still be in charge of the currency.

They will no doubt disrupt the banking industry, enabling more people to be banked, offering faster services and delivering credit to businesses on better terms, while also preserving liquidity and efficiency in capital markets. And while some degree of privacy will be lost, the benefits of protection against fraud and other crime may more than compensate.

Where can I find out more?

Who are the experts on this question?

Author: Andrew Urquhart, University of Reading
Photo from Wikimedia Commons
