
This is problematic


But here’s the thing: the industry is now addicted to a technology that has major technical and societal downsides. CO2 emissions from training large machine-learning systems are huge, for example. They are too fragile and error-prone to be relied upon in safety-critical applications, such as autonomous vehicles. They incorporate racial, gender and ethnic biases (partly because they have imbibed the biases implicit in the data on which they were trained). And they are irredeemably opaque – in the sense that even their creators are often unable to explain how their machines arrive at classifications or predictions – and therefore don’t meet democratic requirements of accountability. And that’s just for starters.

He’s at least partially right there: AI systems do incorporate the biases of the underlying material. As they have to, of course. We desire that they uncover things about reality. Reality is biased. Therefore the AIs have to be biased – in exactly the same way society itself is – in order to be useful at describing reality. The idea of unbiased AIs therefore fails on this particular point. The entire idea disappears in that puff of logical smoke.
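To make that concrete, here is a minimal sketch – not from the article, with a wholly hypothetical dataset, groups and rates – of why a model fitted to biased data simply reproduces the bias. The toy “model” just learns the positive-outcome rate per group from the labels it is given; whatever skew is in the data comes straight back out in the predictions.

```python
# Hypothetical illustration: a trivial model trained on skewed labels mirrors the skew.
import random

random.seed(0)

# Made-up historical data: group A gets a positive outcome 70% of the time,
# group B only 40% of the time -- the bias is baked into the labels themselves.
def make_record(group, positive_rate):
    return {"group": group, "label": 1 if random.random() < positive_rate else 0}

data = [make_record("A", 0.7) for _ in range(5000)] + \
       [make_record("B", 0.4) for _ in range(5000)]

# "Training": estimate the positive rate per group, then predict the majority class.
def fit(records):
    totals, positives = {}, {}
    for r in records:
        totals[r["group"]] = totals.get(r["group"], 0) + 1
        positives[r["group"]] = positives.get(r["group"], 0) + r["label"]
    return {g: positives[g] / totals[g] for g in totals}

rates = fit(data)
predictions = {g: int(rate >= 0.5) for g, rate in rates.items()}

print(rates)        # roughly {'A': 0.7, 'B': 0.4} -- the model has learned the skew in the data
print(predictions)  # {'A': 1, 'B': 0} -- and so its outputs reproduce that same skew
```

The point of the sketch is only that the bias comes from the labels, not from anything the fitting procedure adds; a more sophisticated model trained on the same data would land in the same place.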

But this: “democratic requirements of accountability”. What in buggery does that mean? Markets don’t meet that standard. Ah, so that is what it does mean, in fact: if something is going on that politics cannot control, then that thing should not be allowed to go on. Thus the antipathy to markets, eh?

