Initially touted as revolutionary and progressive in the 1990s, the lightning-fast evolution of digital technologies, riding on the coattails of continuous innovation, has been accompanied by the rise of both extreme socio-economic inequalities and loud, widespread populism, nationalism and overt racism. Many countries are undergoing de-democratization processes undergirded by a very resilient neoliberalism, while claim-making by conservative political actors has gained considerable ground in an always contentious political arena.
The unexpected and devastating pandemic triggered by the accelerated spread of the SARS-CoV-2 virus has laid bare the real constraints of a now aging and highly monopolistic digital sector. While information and communication tools and platforms are indeed
Trashed for the last forty years or so, governments have unexpectedly taken back center stage thanks to the COVID-19 pandemic. The virus does not need a passport to travel around the world, nor has any tough immigration legislation managed to prevent it from freely crossing national borders. No country will be spared seems to be its harsh mandate, in a world where technology and globalization permeate most human interactions. The virus is highly contagious, and the only way known today to decelerate its spread is to minimize direct human contact. In the absence of a global governance mechanism, only national governments can take effective action.
When China first opted to completely shut down Wuhan earlier in the year, the usual suspects immediately criticized the action as “authoritarian” and
In 1988, the Brussels-based Centre for Research on the Epidemiology of Disasters (CRED) launched the Emergency Events Database (EM-DAT) with the idea of promoting national and international humanitarian support to countries and regions affected by such events. Having a structured set of global data on the subject can also help policymakers and decision-makers develop more comprehensive preparedness plans, properly assess vulnerabilities and facilitate on-the-ground interventions based on previous experiences.
Disaster data included in EM-DAT must fulfill at least one of the following conditions: (1) ten or more people reported dead; (2) 100 or more people affected; (3) a declaration of a state of emergency; or (4) a call for international assistance. Data coverage starts in 1900 and covers 230 countries and
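The four entry criteria above can be sketched as a simple check. The function and record layout below are illustrative, not EM-DAT's actual data schema:

```python
from dataclasses import dataclass

@dataclass
class DisasterReport:
    deaths: int
    affected: int
    emergency_declared: bool
    international_assistance_called: bool

def qualifies_for_emdat(r: DisasterReport) -> bool:
    # At least one of the four EM-DAT entry criteria must hold.
    return (
        r.deaths >= 10
        or r.affected >= 100
        or r.emergency_declared
        or r.international_assistance_called
    )

# Example: 3 deaths but 250 people affected -> qualifies via criterion (2).
print(qualifies_for_emdat(DisasterReport(3, 250, False, False)))  # True
```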
Founded almost 40 years ago with the financial support of the MacArthur Foundation, the World Resources Institute (WRI) has been one of the most prominent U.S. research organizations working on environmental issues since its inception. The entity centers its efforts on scientific research and development while explicitly steering clear of "ideology" and activism. WRI has a wide range of scientific publications that have made outstanding contributions to the field over the years.
Last month, WRI published a paper, the 4th of an ongoing series, identifying the policies and technologies the U.S. will need to adopt to undertake carbon removal at scale. The publication offers four broad options, each discussed in detail, backed by relevant research and data, and linked to clear investment strategies
Lack of data is certainly not one of the issues on the table when discussing energy production and carbon emissions. Well-known sources for the former include the UN Statistics Division, the International Energy Agency (IEA), the U.S. Energy Information Administration (EIA), and British Petroleum (BP). BP publishes an annual report, while IEA data sits behind a paywall. The EIA offers open access to a vast number of resources, including international carbon emissions. The main source for emissions data is the Global Carbon Project, created in 2001 and operating as an international partnership. The World Bank has carbon emissions data starting in 1960, but updates seem to have stopped in 2014. The Global Carbon Atlas, initially funded by the BNP Paribas Foundation, is a good secondary source
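The World Bank series mentioned above is available through its public API. As a sketch (assuming the standard v2 endpoint format and the CO2 indicator code `EN.ATM.CO2E.KT`; both should be verified against the Bank's API documentation before use), the request URL can be built like this:

```python
def worldbank_co2_url(country: str, start: int = 1960, end: int = 2014) -> str:
    # EN.ATM.CO2E.KT is the World Bank indicator code for CO2 emissions (kt).
    # The default date range mirrors the 1960-2014 coverage noted above.
    base = "https://api.worldbank.org/v2/country"
    return f"{base}/{country}/indicator/EN.ATM.CO2E.KT?format=json&date={start}:{end}"

# The resulting URL can then be fetched with any HTTP client.
print(worldbank_co2_url("US"))
```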
As expected, ICOs are finally cooling down. There are several reasons for this. First, ICO oversight by regulators in many countries has substantially increased. Regulators are poking not so much into new ICOs as doing deep dives into those already completed, going after the ones that look fraudulent. Second, the token market is in a massive downswing. Some tokens have lost at least 90 percent of their value, leading to substantial losses for ICO investors. As a result, crypto tokens have become much less attractive.
Third, many of the successfully completed ICOs have a hard time showing or delivering on-the-ground results despite massive infusions of capital. While lack of maturity and technology constraints might play a role here, it may also be too early
In a world where perfect information supposedly rules across the board, uncertainty certainly poses a challenge to mainstream economists. While some of the tenets of that assumption have already been addressed (via the theory of information asymmetries and the development of the rational expectations school, for example), uncertainty still poses critical questions.
For starters, uncertainty should not be confused with risk. The latter, in a nutshell, can be quantified using probability theory. Based on existing data and previous behavior, we could predict, say, that there is a 75 percent chance that investments in the stock market will yield a 25 percent reward in, say, 5 years. This is not the case for uncertainty, where the outcome is entirely unknown. In other words, we have no idea what is going to
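The risk side of this distinction can be illustrated with a small simulation: if we assume a known distribution of annual returns (the assumption that makes this risk rather than uncertainty), we can estimate the probability of hitting a 25 percent cumulative gain over 5 years. The mean and volatility figures below are illustrative only:

```python
import random

def prob_reach_target(mean: float, stdev: float, years: int = 5,
                      target: float = 0.25, trials: int = 100_000,
                      seed: int = 42) -> float:
    # Risk: individual outcomes are unknown, but their distribution is
    # assumed known. Annual returns are modeled as i.i.d. normal draws
    # (a simplifying assumption, not a claim about real markets).
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        wealth = 1.0
        for _ in range(years):
            wealth *= 1.0 + rng.gauss(mean, stdev)
        if wealth >= 1.0 + target:
            hits += 1
    return hits / trials

# With, say, a 7% mean annual return and 15% volatility:
print(prob_reach_target(0.07, 0.15))
```

Under genuine uncertainty, by contrast, there is no distribution to feed into such a simulation in the first place.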
According to the latest estimates, global Internet penetration was close to 54 percent by the end of 2017. That is roughly 4 billion people. Figures for the number of unique cell phone users show that 5 billion people have access to the technology.
Armed with these numbers, I asked a business acquaintance who is a blockchain enthusiast and practitioner if the most popular blockchain platforms could effectively cater to all those users. Answer: “Not at this moment. But do not worry, we are working on it.”
The reason for this stems from the scalability constraints the most reputed blockchain platforms face. As I see it, the scalability issue is related to three factors:
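A back-of-envelope calculation shows the size of the gap. Using Bitcoin's commonly cited theoretical ceiling of roughly 7 transactions per second (an approximation, not a precise figure) against the user counts above:

```python
SECONDS_PER_DAY = 86_400

def daily_capacity(tps: float) -> float:
    # Transactions a chain can process per day at a given throughput.
    return tps * SECONDS_PER_DAY

def users_served(tps: float, tx_per_user_per_day: float) -> float:
    # Simplifying assumption: every user makes the same number of
    # transactions per day.
    return daily_capacity(tps) / tx_per_user_per_day

# At ~7 tx/s and just one transaction per user per day:
print(f"{users_served(7, 1):,.0f} users")  # 604,800 users -- far from billions
```

Even under the generous assumption of one transaction per user per day, a chain at that throughput serves well under a million users, against the 4-5 billion potential users cited above.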
In the short and medium term, technology and inequality seem to be positively correlated. In the long term, however, things are not as clear-cut. With the right policies and democratic institutions in place, technology could become a catalyst to reduce income and wealth inequality. Historical evidence from the last century clearly supports this claim. Will digital technologies of the 21st Century follow the same path?
The long-term is still quite a few years away for digital technologies such as AI and blockchains. In this post, I will look at the world of Bitcoin and explore its links to income and wealth inequality. I will assume the Bitcoin network is a country on its own with defined financial ties to the rest of the world, mostly via crypto exchanges and miners.
Last May, the Bitcoin
In a previous post, I pushed the idea that mining is part of the blockchain economy's real sector. Unlike financial speculation, mining requires investment in hardware, electricity, space, human resources, etc. This also applies to small miners, who will undoubtedly defray a smaller investment but can join a mining pool to share mining revenues. Also, miners face intense competition, which reflects the high level of profitability in the sector.
Mining calculators seem to proliferate on the web. Such sites offer potential mining investors a rough idea of how much they can make daily and/or monthly, given the current price of the crypto being mined and the hashing power the investors are willing to purchase. For example, I am told that if I buy Bitcoin mining hardware that can compute 100
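A rough version of what such calculators compute can be sketched as follows. Expected revenue is proportional to your share of total network hashpower; the network hashrate, block reward and price below are placeholder assumptions, and electricity and other costs are deliberately left out:

```python
def daily_mining_revenue(my_hashrate_ths: float,
                         network_hashrate_ths: float,
                         block_reward_btc: float,
                         btc_price_usd: float,
                         blocks_per_day: int = 144) -> float:
    # Expected share of mined blocks equals your share of total hashpower.
    # Bitcoin targets one block every ~10 minutes, hence ~144 blocks/day.
    share = my_hashrate_ths / network_hashrate_ths
    btc_per_day = share * blocks_per_day * block_reward_btc
    # Gross revenue only: electricity and hardware costs are not subtracted.
    return btc_per_day * btc_price_usd

# Illustrative numbers only (network hashrate, reward and price are assumptions):
print(round(daily_mining_revenue(100, 50_000_000, 6.25, 10_000), 2))  # 18.0
```

Real calculators also net out electricity costs and pool fees, which is where small miners' margins are usually decided.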