Keeping tech under control

Technology is developing ever faster. That calls for guidance. Our own moral conscience should provide it, experts say. “If it doesn’t, technology will come back and bite us in the ass.”

Just a news report from Reuters, earlier this year: “Nvidia Corp (NVDA.O) on Tuesday announced new chips and technologies that it said will boost the computing speed of increasingly complicated artificial intelligence algorithms. It steps up competition against rival chipmakers vying for lucrative data centre business. Nvidia also announced its new supercomputer Eos, which it said will be the world’s fastest AI system when it begins operation later this year. Nvidia said the new technologies together will help reduce computing times from weeks to days for some work involving training AI models.”

Sounds spectacular. At the same time, it will not be the last step in the quest for faster technology, which is coming at us at an ever-increasing pace, across the full spectrum of technology, in both hardware and software.

For instance, not so long ago, a new version of Java SE was released every 2 to 2.5 years. Between Java SE 6 and 7 there were even 4.5 years, and between Java SE 8 and 9 about 3.5 years. But since then, things have moved fast: every six months a new version is released with many improvements. While the latest update has only just been released, the next three are already on the horizon. The acceleration of technological developments makes it increasingly difficult for us to grasp the consequences. Moore’s Law, the groundbreaking principle first formulated by Gordon Moore in 1965, says that the number of transistors in a dense integrated circuit (IC) doubles about every two years. That law might be obsolete soon, some researchers say.
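To get a feel for what a doubling every two years adds up to, here is a minimal Python sketch. The starting transistor count and time spans are illustrative choices, not figures from this article:

```python
# Illustrative only: project transistor counts under Moore's Law,
# which predicts a doubling roughly every two years.

def moores_law(initial_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Transistor count after `years` years, doubling every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Starting from a hypothetical chip with 1 million transistors:
print(round(moores_law(1_000_000, 10)))  # after 10 years: 32x -> 32000000
print(round(moores_law(1_000_000, 20)))  # after 20 years: 1024x -> 1024000000
```

The point of the arithmetic is the shape of the curve: exponential growth is gentle at first and then overwhelming, which is exactly why the consequences become hard to foresee.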

A black ball

New technology is like grabbing balls from an urn, as Swedish philosopher Nick Bostrom puts it. “The balls represent possible ideas, discoveries, technological inventions. Over the course of history, we have extracted a great many balls – mostly white (beneficial) but also various shades of gray (moderately harmful ones and mixed blessings)”, he writes in his essay The Vulnerable World Hypothesis. “The cumulative effect on the human condition has so far been overwhelmingly positive, and may be much better still in the future. What we haven’t extracted, so far, is a black ball: a technology that invariably or by default destroys the civilization that invents it.”

Well, that might be about to change. Thanks to increased brainpower and the acceleration of new technology, we have arrived at a point that forces reflection. The same power that offers solutions also offers destruction. That might always have been the case. Take the knife as an example: a great invention that made our lives much easier, but one that has also killed many people throughout history. A big difference now is the instant scale of impact of new technology. “When it comes to using artificial intelligence to design new drugs, the rules are simple: therapeutic activity is rewarded while toxicity is penalised”, a recent article in the Financial Times points out, for example. “But what happens if the rule is flipped, so that toxicity is rewarded? Those same computational techniques, it turns out, can be repurposed to design potential biochemical weapons. AI-designed drugs now have a dark side: AI-designed toxins.”
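The flipped-rule idea can be made concrete with a deliberately toy sketch, far removed from any real drug-design system. All names and numbers below are made up; the only point is that changing the sign of a single weight reverses what the same machinery optimises for:

```python
# Toy illustration (not any real system): a search that scores candidates
# by rewarding one property and penalising another. Flipping the sign of
# the toxicity weight turns the same optimiser toward the opposite goal.

candidates = [
    {"name": "A", "activity": 0.9, "toxicity": 0.8},
    {"name": "B", "activity": 0.7, "toxicity": 0.1},
    {"name": "C", "activity": 0.2, "toxicity": 0.9},
]

def best(candidates, toxicity_weight):
    # score = activity + weight * toxicity; a negative weight penalises toxicity
    return max(candidates, key=lambda c: c["activity"] + toxicity_weight * c["toxicity"])

print(best(candidates, -1.0)["name"])  # penalise toxicity -> picks "B"
print(best(candidates, +1.0)["name"])  # reward toxicity  -> picks "A"
```

The optimiser itself is unchanged; only one number in the objective differs. That is what makes the dual-use problem so uncomfortable.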

Never go back

“That’s a super interesting issue”, says Casper Jaspers. “What happens when you turn an algorithm around? More than ever we should be aware that technology offers both solutions and problems.” Jaspers is a lawyer and writer who specializes in the field of technology and the way in which governments and companies can and may use that technology. His most recent book Technology and Governance offers an extremely interesting helicopter view of technological development and its acceleration. “If you see technology as fishing balls out of an urn, you can at least say that you can’t put the balls back. Once something is developed, we can never go back. Often, the first thing you see in a running-in phase is a kind of resistance to a new technology. Once you have passed that stage, things often move quickly. Much faster than you would have thought possible at the outset.”

A QR code in Jaspers’ book links to a video, funny in hindsight, in which various Dutch citizens were asked about mobile phones at the end of the last century. The general response: we can do just fine without these things and we don’t really see the point of them. “That was only relatively recently”, says Jaspers. The unintended consequences of mass mobile phone use over the years were difficult, maybe impossible to predict at the time the video was shot. “The more complex technology becomes, the harder it is to estimate how that technology will react in the world around us”, says Jaspers.

The acceleration of technology means that developers, programmers and other tech workers are becoming smaller and smaller cogs in an increasingly complex whole. “That is precisely why we need generalists. People who know a lot about tech, but who also master other disciplines and who can fathom the consequences of technological developments looking at the bigger picture”, says Jaspers.

So far we haven’t extracted a black ball that will destroy our civilization

Hunting for bias

This ties in precisely with the vision of author and tech entrepreneur Jim Stolze. “The great tragedy of today’s data scientist is that he or she is becoming part of a very large machine. They can no longer see what their role or responsibility, let alone the implications, are of what they do”, Stolze explains. “There was a time when it was suggested that developers should take a kind of Hippocratic Oath, promising not to hurt anyone. That is nonsense of course, because it’s impossible for them to know the consequences of their work. Many developers tell me that they have no idea what they’re doing: ‘I’m optimizing a library and I have no idea to what end’.”

According to Stolze, assessing data will be essential in the coming years. The entrepreneur recently sold his company Aigency, which built useful artificial intelligence applications. His brand-new company is called Keuringsdienst van Data (Data Inspection Service). “I work with eleven data scientists who have all studied ethics, philosophy or law. We put on a white coat, so to speak, go to a company and do an audit of their data. We can assess its quality, but we mainly go hunting for bias and all sorts of other impermissible things, so that the system is sound before it is put into practice.” Especially in a time when technology is developing at lightning speed, this kind of reflection is indispensable, Stolze believes.


The question remains when we will no longer be able to keep up with the speed of technological development. At the TU/e, for example, Professor Joaquin Vanschoren is working on machine learning models that make machine learning models, Stolze points out. “And that makes everything go even faster. His work produces a very interesting matryoshka effect. Some people will argue that this brings on the Singularity, a moment where the normal rules no longer apply. The Singularity is a theoretical moment anyway, and I think it is in a sense unnecessary to think about a time when computers will be faster than people. Before that happens, we will probably have merged with technology and our neocortex will be connected to supercomputers”, predicts Stolze. “What remains important, however, is that we do not lose sight of our moral conscience in this development. If we do, technology will come back and bite us in the ass.”


This is an article from The Spartan, a magazine by WAES. The Spartan is published twice a year.

Want to receive your copy? Send an e-mail to