Weekend reading: the deadly consequences of unpredictable code

The Guardian’s end-of-August, post-bank holiday/pre-Labor Day essay is scary stuff: it describes how algorithms are morphing beyond the familiar if/then/else model we learned in coding school, or in the IT engineers’ bullpen while straining to understand how the device we wanted to market actually worked. It is especially scary when read in conjunction with the previous article on Click Here to Kill Everybody. We may be preoccupied with badly protected IoT, cybersecurity, and the AI Monster, but this threat is much nearer to fruition, since it already drives areas as diverse and as close to us as medicine, social media, and weapons systems.

The article explains in depth how code piled on code has created a data universe that no one really understands, that is allowed to run itself, and that can have disastrous consequences for society and for our personal safety. “Recent years have seen a more portentous and ambiguous meaning emerge, with the word ‘algorithm’ taken to mean any large, complex decision-making software system; any means of taking an array of input – of data – and assessing it quickly, according to a given set of criteria (or ‘rules’).” Once an algorithm actually starts learning successfully from its environment, “we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us.”
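To make that contrast concrete, here is a minimal sketch of our own (not from the Guardian piece), in Python, using a made-up loan-approval example: an explicit if/then/else rule whose logic anyone can read line by line, next to a toy perceptron that learns its decision from data and leaves you with nothing but a handful of weights to inspect. The names (approve_loan_rule, train_perceptron) are purely illustrative.

import random

# 1. A hand-written rule: every branch is visible, so its behaviour can be audited.
def approve_loan_rule(income: float, debt: float) -> bool:
    if income <= 0:
        return False
    if debt / income > 0.4:   # explicit, inspectable threshold
        return False
    return True

# 2. A learned "rule": a toy perceptron fits three numbers to labelled examples.
#    The resulting weights make the decision, but they are not readable as if/then/else logic.
def train_perceptron(examples, epochs=50, lr=0.01):
    w = [0.0, 0.0, 0.0]                    # weights for income, debt, and a bias term
    for _ in range(epochs):
        for income, debt, label in examples:
            pred = 1 if w[0] * income + w[1] * debt + w[2] > 0 else 0
            err = label - pred             # -1, 0, or 1
            w[0] += lr * err * income
            w[1] += lr * err * debt
            w[2] += lr * err
    return w

# Generate synthetic training data labelled by the explicit rule above.
random.seed(0)
examples = []
for _ in range(200):
    income = random.uniform(10.0, 100.0)
    debt = random.uniform(0.0, 60.0)
    examples.append((income, debt, 1 if debt / income <= 0.4 else 0))

weights = train_perceptron(examples)
print("learned weights:", weights)   # just three floats; the 'why' is not in them

Scale that second pattern up from three weights to millions of learned parameters, feed it live data, and let it interact with other systems built the same way, and you have the kind of opaque, self-running algorithm the article is worried about.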

What’s happening? Acceleration. What’s missing? Any ethical standards, or brakes on this careening car. A must read: Franken-algorithms: the deadly consequences of unpredictable code