I don’t know. Maybe we should.

authoritative science guy has an opinion

A scene from Dollhouse episode 6 – a ‘cinematic scientist guy’ (blackboard with interesting sciencey terms and images behind him) asks a poignant question. Once ‘certain’ technologies reach the high-water mark and flood into the empty valley that is the as-yet-unchallenged human ‘sacred’ – this notion that we have some kind of inalienable free will or self-determination, and that our mind is ‘real’, something that transcends nature – once this comfortable fantasy abruptly shatters and it turns out humans are nothing more than a few hundred grams of ‘programmed matter’ (and badly programmed at that), then the technology will obliterate the sanctity of the human species. It will make us all (or those not in charge) redundant to whatever value system happens to come out on top. Humanity will effectively be scrap, with very low recycling value. From that moment on, the ones in control of this technology will be the posthuman directors of the script.

Infinite power corrupts infinitely, right?

Can we stop one of these technologies from emerging? Not indefinitely; it’s just a matter of time. What we need is to change our current world – our values – to reflect, roughly, the kind of world we may want. Call it insurance: ask yourself how you want to be treated, minimally, and then infuse society with the values that guarantee everyone is treated at least that well.

Systemic insurance, and rock-solid guarantees. Rights for every human, including the ones you’d strip of rights, the ones you regard as most loathsome. Because things change fast, and one day you may be one of a few billion humans regarded as a plague on this planet.

And without rock-solid insurance, this revolution will erase you, and all that you and the generations before you worked to achieve will be wiped off the slate.

Tabula Rasa.

No, we shouldn’t.