Re: Pitchforks And Torches Will No Longer Be Able To Stop The 1%

Posted: Tue May 07, 2019 10:05 am, #30
by MaureenCarter
I wholeheartedly agree with Jessica: we need AI safety measures in place first to control the Technological Singularity. Even Ray Kurzweil promotes this approach of creating safeguards and standards in advance. This is only logical.

Singularity Hub
Ray Kurzweil: We Can Control AI Before It Controls Us
By Andrew J. O'Keefe II - Sep 22, 2016
This cycle of excitement and worry is normal for new technologies. When a technology is first developed, we dream about the good it might do, and as it matures, we worry about unforeseen risks. Taking both sides into account, Kurzweil said he comes out cautiously optimistic. He acknowledges there's risk, and suggests the answer isn't to halt progress, but to plan for it.

By creating safeguards and standards in advance, we can better defend against negative consequences. As an example, Kurzweil points to the 1975 Asilomar Conference, a meeting that sought to define the ethical boundaries of biotech research before it reached its full potential. He believes a similar approach might work for AI and other exponential technologies.

As our technologies change the world, the responsibility grows deeper for each of us to take an active role in shaping it the way we want, not the other way around.

But logic is often not the harbinger of what is to come. Instead, I think greed and the desire for control, as in the incentive for businesses to profit and for conservative brain structures to manipulate, will overrule any logic. That is the only logic that makes sense to me. History is awash with what should have been logical conclusions, only for us to find out they weren't so. As a case in point, just try understanding the logic of having Donald Trump as President of the USA. Now try to imagine what Trump and his allies will do to exploit the Technological Singularity for their own advantage in money and control. Frightening, isn't it?