The proliferation of "top", "best", "fails" and prediction posts on almost any topic is now a staple of the annual transition from one year to the next. As the new year sees the light of day, we seem compelled to take stock of the previous 365.25 days and dig deeper into the recent past. Regular note-taking, logging and recording are, among other things, part of the task. The end of a decade calls for even more elaborate efforts, given the longer period. A few attempts are more ambitious still and, for example, recommend the 100 books one must read before dying. A bit over the top, perhaps. In any event, one could spend a whole year just trying to catch up with all these posts. A better strategy is to focus on areas of interest or specialization. Books, films, social sciences and technology capture
While the dystopian camp perceives digital technologies as a formidable, perhaps even insurmountable threat to society, those on the other, much more optimistic side never seem to tire of repeating their almost countless benefits. The latter camp apparently has the upper hand, at least for now, as its message captures most daily media headlines, mainstream and otherwise. Doom technology scenarios occasionally take center stage when a global personality decides to warn us, once again, about the war we are about to lose should technology be left to its own devices.
Despite such opposing views, both camps share the idea that technology is just like Frankenstein's monster: a human creation that has somehow acquired a life of its own, a distinct personality and a determined will. If we are on the
Initially touted as revolutionary and progressive in the 1990s, the lightning-fast evolution of digital technologies, riding on the coattails of continuous innovation, has been accompanied by the rise of both extreme socio-economic inequalities and loud, widespread populism, nationalism and overt racism. Many countries are undergoing de-democratization processes undergirded by a very resilient neoliberalism, while claim-making by conservative political actors has gained considerable ground in the always contentious political arena.
The unexpected and devastating pandemic triggered by the accelerated spread of the SARS-CoV-2 virus has laid bare the real constraints of a now aging and highly monopolistic digital sector. While information and communication tools and platforms are indeed
A recent paper published under the auspices of Google Health makes a case for using deep learning algorithms to improve breast cancer detection. The research has been positively received by most observers and widely publicized as yet another victory of smart machines over weak, dumber humans. Only a few have been critical, and for good reasons. In this post, I will explore the methodology used in the research to highlight other critical issues. But before I take the dive, let me first set the scene.
Like education or justice, health is an information-rich sector, thus prone to rapid (not just digital) technology innovation and overall digitization. Unlike its peers, most if not all health-related services use a gamut of technologies, from simple thermometers and stethoscopes to noisy, giant
As Artificial Intelligence (AI) seemingly continues to permeate all interstices of society, measuring its relentless progress in the age of data is more than a priority. In a previous post, I shared some insights on the Global AI Readiness Index, which covered almost all UN member states. The new Global AI Index (GAII), created by Tortoise Media with the support of experts from government, academia and the business sector, is geographically less ambitious but aims at a more sophisticated target. It covers 54 countries, and its core goal is not readiness but rather capacity. The company informs us that the index is a response to demands from some governments on the subject. However, the report is intended not only for governments but also for businesses and communities.
In spite of obvious differences,
Trade is one of the main hallmarks of the globalization process. Nowadays, most countries exchange products and services regularly and use local comparative advantages to specialize in specific trade sectors and/or commodities. Food and agricultural products are important components of this process. Within countries, rapid urbanization has increased the demand for food. Simultaneously, the number of people working in the agricultural sector and living in rural areas has decreased substantially. While some food staples are imported, others are still produced locally but must travel from rural areas to urban centers and big cities to meet the demand.
Food products are thus in perpetual motion, moving as soon as possible from their place of origin towards a wide variety of geographic locations,
In the last decade, Artificial Intelligence (AI), including its siblings machine learning and deep learning, has been growing by leaps and bounds. More importantly, the technology has been deployed effectively in a wide range of traditional sectors, bringing real transformational change while raising fundamental socio-economic (joblessness, more inequality, etc.) and ethical (bias, discrimination, etc.) issues along the way. As it stands today, AI, understood as a set of still-evolving technologies, seems poised to become a general-purpose technology that could leave no sector untouched.
As with other digital technologies, most developing countries face the daunting challenge of harnessing AI to foster national human development. Prima facie, AI looks mostly like software, code that one can
A recent piece in MIT’s Technology Review nicely summarizes the issue of bias in AI and machine learning (AI/ML) algorithms used in production to make decisions or predictions. The usual suspects make a cameo appearance, including data, design and implicit fairness assumptions. But the article falls a bit short, as it does not distinguish between bias in general and bias that is unique to AI.
Indeed, I was surprised to see the issue of problem framing listed as the first potential source of AI bias. While this might occur in some cases, it is not an issue that pertains only to AI projects and enterprises. For example, large multinational drug companies indeed face a similar challenge. Nowadays, almost none of them are investing in developing new antibiotics to stop the spread of the so-called superbugs nor have any interest
In the previous post, I provided a simple definition of an algorithm and then explored their use in the digital world. While algorithms live off the inputs they are fed, digital programs such as mobile apps and web platforms comprise a series of algorithms that, working in sync, deliver the desired output(s). Algorithms sit between a given input and the expected output: they take the former, do their magic and yield the latter.
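The idea of algorithms sitting between inputs and outputs, and of a program as several algorithms working in sync, can be sketched in a few lines of code. This is a hypothetical toy example of my own making, not drawn from any real program: two tiny "algorithms" are chained so that the output of one becomes the input of the next.

```python
# A minimal sketch of a "program" built from two small algorithms
# working in sync. The function and variable names are illustrative.

def clean(words):
    # Algorithm 1: take raw input and normalize it (trim, lowercase).
    return [w.strip().lower() for w in words]

def rank(words):
    # Algorithm 2: take the cleaned input and yield a sorted output.
    return sorted(words)

def program(raw_input):
    # The program chains the algorithms: input -> clean -> rank -> output.
    return rank(clean(raw_input))

print(program(["  Banana", "apple ", "Cherry"]))
# -> ['apple', 'banana', 'cherry']
```

Each function takes a given input, does its "magic" and yields an output; the program as a whole is only the sum of its synchronized parts.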
There is a direct relationship between the complexity of the planned output(s) and the coding effort required. The latter is usually measured by the number of lines of code in a given program. For example, Google is said to have over 2 billion lines of code (2×10^9) supporting its various services. You certainly need an army of programmers to create, manage
While the concept of the algorithm has been around for centuries, the same cannot be said about algocracy. The latter has recently gained notoriety thanks in part to the renaissance of Artificial Intelligence and Machine Learning (AI/ML) and is frequently used to describe the increased use of algorithms in decision-making and governance processes. Indeed, the so-called Singularity could be seen as an extreme and seemingly irreversible case of algocracy in which humans lose the capacity to control superintelligent machines and might even face extinction. Not sure that will ever happen though.
A more plausible scenario takes place when humans and human institutions blindly rely on algorithms to make critical decisions. This is happening today in many sectors – the quasi-dictatorship of algorithms. In