The Good Life


Let me be clear – this article is not about me telling you what to do in life. This article is not about turning people into “the Borg” or whatever. I am describing a concept that will be new to most people, whereas for “transhumanists” the idea is fairly self-evident. Other people, say people like Alex Jones, will scream bloody murder reading about this idea – and maybe they should.

If you use a scheduler, or a calendar, you use a tool. If you use an answering machine and an automatic scheduler, again, you use a tool. A Gantt chart? A tool. You submit to a self-imposed structure in your life. My question here is: how far can we take this?

Taking myself as an example (and I am positive this goes for most people), I wasted a LOT of time in my life. Having some spectrum analogue of ADHD didn’t make things easier. Having had an emotionally disheveled mother and a hyper-aggressive, sociopathic father didn’t help much either. Like many people, I made some structurally stupid choices, and I can easily argue society did its utmost to aggravate those choices in the worst possible manner.

Fast forward to deep learning algorithms.

We are now in an era where advanced learning systems can recognize the most subtle and gradual of patterns in every aspect of reality. In the last five years machines have taught themselves to perform tasks we would have regarded as a manifestation of some kind of magic ten years ago. Every day new examples emerge of things hitherto regarded as patently impossible.

AI equal with human experts in medical diagnosis, study finds

In a few years any new car will be able to drive itself, affordably. These systems are becoming so universally safe that to me it’s pretty certain cities will soon simply outlaw human drivers – a few decades at most, if that. That means you soon won’t be able to drive your manually operated Lamborghini in a city like London. You will have to log in to ‘citygrid’, a system that steers you around traffic jams, manages your maximum speed, protects you from accidents, and drives you sustainably and efficiently to your destination. If your car isn’t automated, you are not allowed in, and you are fined if you drive in anyway. Something like this is pretty much certain, and fairly soon.

We can unleash a systems analysis on any process and come up with meaningful algorithms that do things better than anything a human could ever have come up with. You can show a machine a street scene and it can now tell you what’s happening in that scene. And the scariest thing is that most people are still blissfully unaware this is happening.

So what do I propose?

I really would like a suite of advisor software – a complex, intelligent, self-learning algorithm that’s constantly updated. This happened with Teslas once: on Friday the cars could barely move on their own; over the weekend the software was updated, and the week after people started experimenting with driving their Teslas across the entire US continent.

Can we do this with human life?

I think we can, and that’s both scary and very exciting. Imagine this – a suite of advisory systems that constantly monitors your actions, choices, goals, desires, needs, dangerous urges, whatnot – and you establish goal sets with your software. Like – you want to get fitter, and be able to run a certain distance at a certain speed. Or lose weight. Or move permanently to another country and live next to the sea.

The software would have looked at millions of volunteers, each establishing similar system constraints, and would have learned how to do things, how to avoid trouble, how to get things done, how to get the most enjoyment out of life, how to not burn out, how to maximize your longevity, how not to get diabetes or cancer, etcetera. Of course you would establish boundaries. If you insist on smoking, the software ignores that, until certain extreme thresholds are crossed, based on warning signs you stipulate yourself. You can set your software any way you like, as long as the system doesn’t decide you are committing crimes or seriously hurting yourself. The best software should help you legally get the healthiest heroin for the least amount of money – if that’s what you desire. No doubt there would eventually be management software geared to running criminal enterprises, and whatnot. Terrorism, even. Religious extremism. Autocratic societies might run reward schemes if you have state-approved “proper behavior” citizen management software.
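To make that a bit more concrete, here is a minimal sketch, in Python, of what a “goal set” with user-stipulated boundaries and warning thresholds might look like as plain data. Every name and number in it – the Goal and Boundary records, the twenty-cigarettes-a-day threshold – is a made-up illustration, not an existing product or API.

```python
from dataclasses import dataclass, field

# Hypothetical illustration: one way a personal "goal set" with
# user-stipulated boundaries and warning thresholds could be encoded.

@dataclass
class Goal:
    name: str            # e.g. "run 5 km"
    target: float        # numeric target the advisor works toward
    unit: str            # unit of the target, e.g. "minutes", "kg"
    deadline_days: int   # horizon the user sets for reaching the target

@dataclass
class Boundary:
    behavior: str             # behavior the user tells the software to leave alone
    warning_threshold: float  # user-stipulated point at which it may speak up
    unit: str

@dataclass
class GoalSet:
    goals: list[Goal] = field(default_factory=list)
    boundaries: list[Boundary] = field(default_factory=list)

# The smoking example from the text: the advisor stays silent about it
# until the user's own warning sign (here, 20 cigarettes a day) is crossed.
my_plan = GoalSet(
    goals=[Goal("run 5 km", target=30, unit="minutes", deadline_days=120)],
    boundaries=[Boundary("smoking", warning_threshold=20, unit="cigarettes/day")],
)
```

The point of the sketch is only that the user, not the vendor, sets both the goals and the lines the software may not cross.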

The software would also be flat-out honest with you – “no, you won’t be an Olympic athlete” – it would tell you which goals, desires and expectations are unrealistic, and to what degree. People might listen more readily to their own software advisor than to their priest or their physician.

The implications of widespread voluntary use of this software would be profound, especially if the user could enter a somewhat meaningful dialogue with the system, query it, instruct it, steer its goal sets, and so on. I would give my software certain fitness goals to coach me toward, I would tell my software how to keep me out of financial trouble, I would tell my software to “gently” coax me into societally moral, pleasant, gregarious behavior that maximizes my life enjoyment and actually helps other people flourish in my presence. I might even set my management software to subtly negotiate with the software of other people, “negotiating” my goal sets against their goal sets without showing my cards, so to speak. I might even set my software to represent me financially, legally, medically, educationally or politically in the manner that best serves my needs.
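As a toy illustration of that last point – advisors negotiating on your behalf without showing your cards – two pieces of software might exchange only acceptable options, never the private preference scores behind them. Again, the function and the data here are purely hypothetical:

```python
# Hypothetical sketch: two advisors settle on a shared outcome (say, a
# meeting time) by exchanging only acceptable options, never the private
# preference scores behind them ("without showing my cards").

def negotiate(my_options: dict[str, float], their_acceptable: set[str]) -> str | None:
    """Return my highest-scoring option that the other party also accepts."""
    for option in sorted(my_options, key=my_options.get, reverse=True):
        if option in their_acceptable:
            return option
    return None  # no overlap: hand the decision back to the humans

mine = {"monday 9:00": 0.9, "tuesday 14:00": 0.6, "friday 17:00": 0.2}
theirs = {"tuesday 14:00", "friday 17:00"}
print(negotiate(mine, theirs))  # -> "tuesday 14:00"
```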

Modern life has become extremely stressful and inhumanly complex for a lot of people. Most people would not want to be managed in the above manner, and that would be fine, but I can most certainly see people seriously needing and wanting such software. I can even visualise psychiatric or parental specialty packages that allow therapists’ algorithms to help you maintain a sane life, or raise sane, pleasant and effective kids. I arguably would have been a much better human if my parents had had access to this software.