At some point things change. I guarantee you: despite the claims of the simpletons that “there will be nothing new under the sun,” I insist there will be new things, there will be many new things, all the rules will change, and before the generation born today has died of old age the world shall lie reshaped and torn apart in ways we won’t even have proper words for.
As if accelerating technology and the unintended consequences of technology, industry and chaos weren’t enough, the vile predatory instincts of the human primate to exploit its fellow human beings all conspire to turn our world into a total failure. Yes, that’s right: this world has a ‘natural’ potential and we are squandering it. A few centuries from now there could be trillions of humans, all living far richer existences than we dare dream of, and we are wasting it right in front of our very eyes.
This article is part of a series of talks stressing the sheer importance of our current era. What we do now matters more than you can imagine. This isn’t just a period like any other in history – what we do or fail to do now is the actual deal-breaker between a miserable nightmare of a future and a pretty nice future. Or even the complete and irreversible extinction of life and meaning on this planet and in this region of the galactic arm.
Let’s return to the topic at hand – the explosive escalation of technology, right under our very noses, that will lead up to one of the possibilities above, which I won’t repeat. I want option one – the perfect outcome, and ideally one where I survive indefinitely.
This article is discussed in Second Life. I have invited these people to the conversation – Amanda Stoel, Miriam Ji Jun, Rachel Haywire, Jason Patrick Schoenecker, Mike Anissimov, Eugen Leitl, Extropia DaSilva, Giulio Prisco, Ben Goertzel, Kevin Warwick and Ray Kurzweil. The conversation is tomorrow, Sunday the 17th of April 2011, and it will be here:
White Rabbit SLURL: http://slurl.com/secondlife/delinquent/160/192/503.
The schedule of events leading up to White Rabbit will be:
Serendipity – Fulfillment – 09:30–11:00 SLT (lasts generally about 90 minutes)
Bryce – Ideas of Things to Come – 11:00–12:00 SLT (lasts generally about an hour)
Khannea – FTWR – 12:00–13:30 SLT (lasts generally 90 minutes)
The topic today is robotics, and specifically a very constrained niche of robotics. Try to visualise a niche of automated systems slowly emerging after 2015 that has the following characteristics –
1 – it isn’t as intelligent, in the strict “general intelligence” sense, as humans
2 – it is no longer strictly under control of humans
3 – it barters with humans or other similar (or dissimilar!) systems
4 – it engages in creative or lateral strategies and somehow succeeds in developing or generating new strategies
5 – it does not in any way have a psychological makeup that is derivative or exemplary of a human psychology
6 – it may move in distinctly different media.
Examples of these might be
(a) a botnet that escapes control by its creators (or its creators all “fall away”), is programmed to protect itself and perpetuate itself, hires (Mechanical Turk style) human operators for tasks only humans can do, while evolving or generating new approaches to increase its “utility”, “survivability” or “marketability”.
(b) a military or corporate or industrial or financial system that was given some level of autonomy, experiences the end of the cause it was supporting, and then survives by working on or experimenting with permutations of the original set of instructions it had – i.e. it survives as long as it generates new goals.
(c) a dummy company that isn’t an A.I. at all, but is a complex feedback mechanism of holding companies, investment funds, foundations, attorneys, numbered bank accounts and set instructions, acting on behalf of (for example) a dead billionaire, perpetuating itself even though none of the interlocking transactions make sense anymore – BUT THE COLLECTIVE is still growing, expanding, taking on power and defending its abstract interests.
(d) a category of self-repairing machine toys that are hacked and proceed to generate algorithmic hacking tools and subroutines that allow the individual constituents of the clade of devices to defend its core values, acquire value, spread its memes (breed), repair damage, subvert other systems, and engage in trade or barter with similar entities.
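Example (a) above – a system with no general intelligence that nonetheless shifts strategies to increase its “utility” – can be sketched in a few lines of code. This is a purely illustrative toy (an epsilon-greedy bandit; the strategy names and payoffs are hypothetical, not a claim about how real botnets work):

```python
import random

class AdaptiveAgent:
    """A non-intelligent but adaptive system: no world model,
    just a running payoff estimate per strategy (epsilon-greedy)."""

    def __init__(self, strategies, epsilon=0.1):
        self.payoffs = {s: 0.0 for s in strategies}  # running mean reward
        self.counts = {s: 0 for s in strategies}
        self.epsilon = epsilon  # fraction of purely exploratory choices

    def choose(self):
        # Mostly exploit the best-known strategy, occasionally explore.
        if random.random() < self.epsilon:
            return random.choice(list(self.payoffs))
        return max(self.payoffs, key=self.payoffs.get)

    def learn(self, strategy, reward):
        # Incremental mean update; individual episodes are forgotten.
        self.counts[strategy] += 1
        self.payoffs[strategy] += (reward - self.payoffs[strategy]) / self.counts[strategy]

# Hypothetical environment in which one strategy quietly pays better.
random.seed(0)  # deterministic demo
agent = AdaptiveAgent(["hold_dollars", "buy_gold", "hire_operators"])
true_payoff = {"hold_dollars": 0.1, "buy_gold": 0.5, "hire_operators": 0.3}
for _ in range(1000):
    s = agent.choose()
    agent.learn(s, true_payoff[s])
print(max(agent.payoffs, key=agent.payoffs.get))  # → buy_gold
```

Nothing in this loop “understands” dollars or gold; it simply drifts toward whatever its feedback rewards – which is all that example (a) requires.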
The question for today is – how far are we from this point? Arguably, our international corporate sector has already evolved distinct features of self-perpetuation, value perpetuation, competitor eradication, bartering and politics that fall squarely in this category, and it can be argued that increasingly the value systems of such corporate entities transcend the values and utilities of their human founders, benefactors and constituents. We could already say the same of criminal syndicates.
As automation increases will this dehumanization increase? Will there be certain tipping points along the way in this trajectory? Can we leverage this process to our benefit or should we be cautious about it? Where should we watch for escalatory warning signs?
Is the emergence of unintended automated systems in our ever more complex human societies a safe development, even LONG before they actually get smart?
[12:00:54] Khannea Suntzu: Hello there
[12:01:07] Ivy Sunkiller: no difference I’m afraid Kimiko 🙂
[12:01:08] Khannea Suntzu: Interesting 🙂
[12:01:20] Ivy Sunkiller: hello there people popping out of nowhere
[12:01:40] Peer Infinity kisses Khannea 🙂
[12:01:41] Kimiko Yiyuan: Awww.
[12:01:58] Bogdan Ixtab: hello
[12:02:04] Khannea Suntzu: Yelp!
[12:02:07] Kimiko Yiyuan: Ok. I shall remain a mystery then.
[12:02:22] Khannea Suntzu: Hey Peer
[12:02:33] Khannea Suntzu: I was momentarily zoning away love
[12:02:50] Jilli (jiiiianne.sideways): hello : )
[12:02:57] Peer Infinity: those panties look uncomfortable. would you like me to take them off for you? 🙂
[12:03:00] Ivy Sunkiller: hello Immm 🙂
[12:03:23] Khannea Suntzu: Dont bite them off me, those teeth look sharp and pointy!
[12:03:34] Seren (serendipity.seraph): hey immm
[12:03:34] Peer Infinity: hehe, ok 🙂
[12:03:48] Khannea Suntzu: You re so.. white, peer!
[12:03:54] Bryce Galbraith: Hi all 🙂
[12:03:58] Peer Infinity: yes I am 🙂
[12:04:08] Android Neox: Are we listening to Myster Science Theater?
[12:04:21] Seren (serendipity.seraph): hey Hell!
[12:04:36] Hell Otsuka: Hi.
[12:04:39] Seren (serendipity.seraph): perched on a lampshade. 🙂
[12:07:45] Khannea Suntzu: Mew!!
[12:08:03] Khannea Suntzu: Hey HEY this makes it harder to intone myself!!
[12:08:13] Khannea Suntzu: Ok shall I start?
[12:08:13] Ivy Sunkiller: the only talk around where you can torture the host all you want 🙂
[12:08:16] Peer Infinity: is this better? 🙂
[12:08:24] Khannea Suntzu: *gasp!* yes
[12:08:33] Bryce Galbraith: lol!
[12:08:36] Metafire Horsley: ok, 5t4r7
[12:08:40] Khannea Suntzu: Hey I am nothing if not accomodating 🙂
[12:08:49] Seren (serendipity.seraph): laughs!
[12:08:54] Peer Infinity playfully traces my furry finger around Khannea’s belly button 🙂
[12:09:03] Seren (serendipity.seraph): khannea takes on all comers..
[12:10:01] Khannea Suntzu: Welcome y’all to this Sunday meeting of “Follow the White Rabbit Down The Rabbit Hole”, a mostly weekly event with the purpose of freeform, lateral, low-taboo, low-authority discussion of the potentially societally disruptive as well as potentially very rewarding technological (and other?) changes in the next decades.
[12:10:20] Khannea Suntzu: You can talk *all you want*
[12:10:27] Khannea Suntzu: Just rip loose
[12:10:39] Khannea Suntzu: I’ll make a few suggestions
[12:12:03] Drake (drake8889.steerpike): Did they get the reactors contained? I have not heard anymore about that.
[12:12:45] Khannea Suntzu: Topics of interest in these talks are (positives) nanotechnology, new media, new energy, augmented reality, virtual reality, simulation, robotics, cybernetics, nootropics, near-space industrialization, 3D printing, gaming, life extension, biogenetics, rejuvenation, artificial intelligence (and many other things) as well as (negatives) resource depletion, state mismanagement, the environmental debate, the left-right debate, the collapse of current state models, debts, the dollar problem, imperialism, corporatism, terrorism, state fascism, overtaxation (and many other things).
[12:13:08] Khannea Suntzu: This should not be a utopia forum. This should not be a dystopia forum. Ideally this should be a discussion forum of objective, detached, slightly amused observers and educated commentators on global trends. Also, in all my talks I actively dissuade debate on “the Singularity“. While I favor the theoretical idea of “A” Singularity as a transitional event that would probably occur in some form between 2030 and 2050, we cannot say much about it since it implies (as yet) absolute unknowns, so for practical reasons I choose not to debate it in this series of forums.
[12:13:25] Khannea Suntzu: Sing-U-La-Ri-Ty
[12:13:51] Bryce Galbraith: gotcha… 🙂
[12:13:55] Khannea Suntzu: Oh, anyone who hasn’t seen Ray’s movie btw and wants to see it – I can share it through Dropbox. Email me.
[12:14:05] Khannea Suntzu: But aside from that…
[12:14:15] Khannea Suntzu: One set of important premises of my current series of forums is to look at the people worth considering – “My Kind Of People”. That may seem selfish or even elitist or cultist, but let me insist. The world is in my opinion not doing very well, and insofar as people agree, either most people are apathetic about it, or are actively opposing finding solutions. I most certainly do NOT include greedy corporates, sell-outs, shills and enablers, criminals and gangsters, oligarchs, populists, demagogues and lying career politicians. I also have not a shred of patience for all those civilian simpletons out there glued to their television, consuming away all day, assuming like sheep it’ll all be the same more or less forever.
[12:15:07] Khannea Suntzu: Yah I am becoming a little more outspoken
[12:15:18] Khannea Suntzu: On that note did Rachel show up today?
[12:15:24] Khannea Suntzu: Rachel?
[12:15:30] Ivy Sunkiller: not that I know
[12:15:34] Khannea Suntzu: Oh darn it
[12:15:37] Khannea Suntzu: I advocate an activist militant counterculture of self-empowerment, lateral thinking, expediency, independence and (if need be) secessionism. I think our collective establishment is poisoned by self-interest and lack of ideals. Our political castes are sell-outs and hypocrites, and we need to get off our asses since nobody is going to do it for us. Now I can say all these nice intense phrases and you will probably start feeling all warm and like pirate Jack Sparrow inside, but unless you do the following things it’ll all be for nought.
[12:15:47] Seren (serendipity.seraph): do you consinder “greedy” as synonymous with corporate? How do you tell if a politician is lying? 🙂
[12:15:56] Khannea Suntzu giggles
[12:16:14] Ivy Sunkiller: Seren: politicians are lying by default, that’s their job
[12:16:19] Khannea Suntzu looks enigmatic at serendipity
[12:16:40] Seren (serendipity.seraph): but *rational* self interest is my ideal
[12:16:41] Khannea Suntzu: (1) You face up to reality and stop living in denial. (2) You learn to distinguish facts from fiction, and you make damn sure you get information, good theories and proper tools, and you distill knowledge from these. And then (3) you act on this information, face up to the fact that the majority of people out there do not (yet) have your interests at heart, and that it’ll only get worse before it gets better. We live in an extraordinary era, and the consequences of this should be clear to anyone with half a brain – what’s worse, we live in a crucial transitional phase in history, a fact that should be clear if you are a little smarter and more educated than the common idiots. The consequence is that the more crucial this moment in time, the more significant everything stupid, superstitious, selfish, charitable, constructive or destructive done right now becomes.
[12:17:20] Khannea Suntzu: Most here know – somewhere before 2050, in one of three ways (and maybe in a way I don’t envision yet), “we” will end up with a new nonhuman intelligence sharing this planet with us. Of this new intelligence we know only a few things. We can safely say that once near-human general artificial intelligence emerges it will become a LOT smarter than average or even exemplary human intelligence, and it will do so very fast. About what happens next we know very little, other than that it can get very good, or very bad – and the worst possible result may be very good for a small number of people and unspeakably horrendous for everyone else.
[12:17:40] Khannea Suntzu: Now who here thinks that is pretty unlikely?
[12:18:54] Bryce Galbraith: hmmm… by 2050?
[12:19:03] Khannea Suntzu: So everyone – literally everyone here – subscribes to the idea we can create or generate strong AI this century?
[12:19:06] Seren (serendipity.seraph): AGI is unlikely if we screw up badly enough. Otherwise I am pretty sure of it
[12:19:16] Ivy Sunkiller: 2050 matches Ray’s prediction of 2045, so yeah
[12:19:16] Hell Otsuka: expectedly approximately, Bryce.
[12:19:36] Khannea Suntzu: Non general AI a lot earlier
[12:19:48] Khannea Suntzu: General AI on par with humans, earlier
[12:19:52] Seren (serendipity.seraph): We have non general AI now. It is ubiquitous. I am married to one.
[12:20:10] Khannea Suntzu: stuff that will make us all unemployable, probably 2035
[12:20:16] Hell Otsuka: I do have noticeable doubts about possibility of an acceptable AI, but it doesn’t change much for me besides having significant interest in other possibilities.
[12:20:18] Peer Infinity: agreed. humanity might not survive long enough to create an AGI, but other than that, yeah, we’re likely to create strong AI this century.
[12:20:35] Khannea Suntzu: wow you are an easy crowd
[12:20:40] Bogdan Ixtab: we can get a global dictatorship with the stated goal of preventing AGI and technology advanceme nt in general; that would be the only likely scenario for stagnation in my view
[12:20:45] Khannea Suntzu: I wish I got laid that easy
[12:20:52] Hell Otsuka: Hello to `preachers to the converted`.
[12:20:56] Laserkitty Ling (laserhop.rothschild): bites lip
[12:21:00] Peer Infinity: um… afaik you do get laid that easy, Khannea 🙂
[12:21:03] Ivy Sunkiller: Bogdan: that or going back to stones and sticks era :p
[12:21:06] Khannea Suntzu: I state that in the long term, for the next century or centuries, there are no other options, with my current understanding of science and statistics, than the following – (1) we collapse into a more or less irreversible dark age, comparable to a worldwide Afghanistan, with very little progress, that may last us thousands of years (i.e. for all practical purposes forever); (2) the human species (and probably most complex vertebrate life) goes extinct somewhere this century or the next, either by a series of escalating existential catastrophes, or by being replaced by something manmade and ruthless; (3) we enter into a singularity state, which can be any of literally hundreds of different hyper-surreal technological development scenarios.
[12:21:23] Seren (serendipity.seraph): in 2035 my primary will be *egads* 80 and of doubtful employability barring anti-aging advances in any case.
[12:21:55] Khannea Suntzu: Life extension, please gods, let’s fret over that another day 🙂
[12:22:15] Bryce Galbraith: 🙂
[12:22:19] Seren (serendipity.seraph): we could easily lose the tech basis to create AGI or nanotech or even tread water where we are.
[12:22:19] Hell Otsuka: Bogdan, it doesn’t have to be dictatorship that much; rather, if no obvious way for making an AGI friendly is found, it will be paranoidally banned.
[12:22:23] Khannea Suntzu: So three scenarios…..
[12:22:26] Bryce Galbraith: Yes, clinical immortality is another topic …. 🙂
[12:22:32] Khannea Suntzu: we become cavepeople (mad max world, i.e. afghanistan everywhere)
[12:22:37] Khannea Suntzu: we all die horribly before that
[12:22:44] Khannea Suntzu: or there is singularity and some or most humans survive, though possibly in another form.
[12:22:56] Khannea Suntzu: you guys see alternatives happen before say, 2200?
[12:23:02] Khannea Suntzu: A, B or C
[12:23:19] Khannea Suntzu: aliens from Zeta Reticuli?
[12:23:27] Peer Infinity: Serendipity just described scenario D
[12:23:40] Khannea Suntzu: (ed: Ok… so Serendipity subscribes to a ‘steady state scenario’ of treading water. I do not believe in that at all – I think we will have a resource collapse and massive wars before that)
[12:23:46] Ivy Sunkiller: Moties? :p
[12:23:53] Khannea Suntzu: Heh
[12:23:57] Bogdan Ixtab: hell, if the state is not totalitarian, there will always be powerfull underground movements that may develop their own AGI which will then take over
[12:24:18] Khannea Suntzu: In all these cases the lynchpin is the creation of something that overcomes problems and challenges smarter than a human being. Or faster. Or simply better. Or in a radically different manner. We always assume this is artificial intelligence.
[12:24:29] Peer Infinity: and two other people described scenario E
[12:24:32] Ivy Sunkiller: ah underground of AGI, how fascinating
[12:24:37] Hell Otsuka: Bogdan: … which will then be “we all die” scenario, indeed.
[12:24:39] Seren (serendipity.seraph): with tech enabled universal surveillance it will become much harder to have and effective underground
[12:24:51] Khannea Suntzu: Garage AI, oh at some point it will become ‘easier and easier’
[12:24:56] Ivy Sunkiller: I for once welcome our new robot overlords!
[12:25:02] Bryce Galbraith: 🙂
[12:25:06] Khannea Suntzu: Eliezer Yudkowsky had a nice quote on that
[12:25:07] Bogdan Ixtab: yeah, pretty much
[12:25:41] Seren (serendipity.seraph): I am so disappointed in “my poor humans” that I would welcome something better.
[12:25:41] Khannea Suntzu: Today I’d like to look at features in several distinct domains. This may get a little abstract so bear with me. The domains are: (a) the realm of evolutionary adaptation in life; (b) the realm of virtual reality, as well as all realms of games, game design, simulations; (c) the realms of financial and legal constructs; (d) the realm of metaphysics, magic, spirituality and other allegedly “fictional” metaphors.
[12:26:17] Khannea Suntzu: Yes you heard that correctly
[12:26:32] Khannea Suntzu: I included fictional shit, as a brainstorming tool
[12:26:43] Seren (serendipity.seraph): you don’t give over-limit you area, do ya? 🙂
[12:26:54] Peer Infinity: I think the quote was something like “Every 18 months, the IQ required to exterminate humanity drops by 2 points”
[12:27:01] Khannea Suntzu: Gibson did so too, with his voodoo AI. It’s all about understandable metaphors.
[12:27:04] Rhiannon of the Birds (rhiannon.dragoone): Peer, lol
[12:27:08] Lolle Edenbaum: Says who @Peer?
[12:27:09] Khannea Suntzu: Yah that was the quote 🙂
[12:27:16] Khannea Suntzu: Elizer
[12:27:27] Khannea Suntzu: In these (and many other) niches we can postulate beings (..animals) that have goals, try to survive, try to defend, try to procreate or at least pass on traits, engage in politics, engage in accumulation of wealth, and control the environment. In the above environments animals do it, computer viruses do it, governments do it, banks do it, demons and angels do it, gods do it. That some of these entities are fictional doesn’t make much difference – virtual realities are fictional anyway – the argument I am making is that we can postulate several overlapping arenas in the REAL world of scarcities and competition, specifically ‘cyberspace’, ‘the financial sector’ and ‘the physical world’, where soon we will be facing (and may be argued to already be implicitly in competition with) nonhuman consumers.
[12:28:03] Khannea Suntzu: And – they dont need to be smart yet
[12:28:14] Khannea Suntzu: case in point – US presidents
[12:28:31] Khannea Suntzu: The problem is this:
[12:28:33] Khannea Suntzu: Take for instance this quote: “Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed.”
[12:28:35] Seren (serendipity.seraph): sure. try playing poker or chess agains even a modest program..
[12:29:24] Seren (serendipity.seraph): or to put it another way, a theft from anything productive you could have done with those resources
[12:29:24] Khannea Suntzu: So what is it about military expenditures that compels a state to spend more and more collective (societal?) scarce resources on a pursuit that can in a way be argued to be an irrational waste? Let’s try and call this the X factor. The X factor would be the critical reason that compels humans, sometimes in the face of reason, to waste resources on irrationalities. It might be fear, it might be a system that evolved out of bounds, it might be systemic corruption. Whatever the cause of X is, or the operating mechanism of X is, let’s summarize: X is the factor that detracts resources from humans and wastes them on something else, on behalf of some kind of error, or a self-perpetuating parasitic feedback loop.
[12:29:58] Metafire Horsley: Ah, I have to remember that … *human life is not productive* ^^
[12:30:11] Seren (serendipity.seraph): most states don’t spend more and more on this. only empire stage states generally do or ones feeling very threatened
[12:30:18] Rhiannon of the Birds (rhiannon.dragoone): hi kimiko
[12:30:44] Seren (serendipity.seraph): the US spends a far larger part of its budget on “defense” than any other nation
[12:30:54] Khannea Suntzu: Can you guys give me an example of a really stupid waste of valuable resources on absolute bullshit that isn’t even fun (other than war of course)?
[12:31:04] Khannea Suntzu: And yes, religion is a bit easy too
[12:31:16] Seren (serendipity.seraph): who said that, meta?
[12:31:22] Ivy Sunkiller: Khani: biofuels?
[12:31:28] Bryce Galbraith: Hi Rhi 🙂
[12:31:52] Metafire Horsley: I thought that was the gist of what you were saying, Seren.
[12:32:02] Khannea Suntzu: Interesting Ivy. Is the idea of biofuels a monumental and totally parasitic waste of collective resources?
[12:32:15] Khannea Suntzu: A total catastrophe?
[12:32:16] Seren (serendipity.seraph): uh, paying for an ever-growing parasitical government isn’t my idea of fun.
[12:32:23] Ivy Sunkiller: well it does take more oil to produce biofuels than you get from it 🙂
[12:32:41] Peer Infinity: this is kidna off-topic now, but I remember seeing a LJ post by someone who argued that since it costs about $800 to save the life of an african child, we can refer to the amount of $800 as “one dead child”. so, you could say, for example, “this car costs 20 dead children”…
[12:32:44] Hell Otsuka: K, every resource expenditure has *some* meaning; at most it might be staggeringly ineffective.
[12:32:56] Seren (serendipity.seraph): not at all meta. I said the resources used for war could be used much more productively or that was what was meant
[12:33:14] Khannea Suntzu: The point is
[12:33:19] Khannea Suntzu: My argument is
[12:33:20] Rhiannon of the Birds (rhiannon.dragoone): So anyone mind sharing the topic with the entertainment?
[12:33:28] Rhiannon of the Birds (rhiannon.dragoone): Are we still on how to solve the energy crisis?
[12:33:36] Seren (serendipity.seraph): well that is really gruesome and unuseful way to put anything peer
[12:33:52] Khannea Suntzu: That there can be complex systems that can start living a life of their own, despite all commnon sense, and waste resources…. Systemic parasites….
[12:33:59] Seren (serendipity.seraph): those african children have no claims on my resources whatsoever
[12:34:07] Bogdan Ixtab: nothing is completely parasitic/usless, example: military usually encourages a lot tech advances
[12:34:15] Seren (serendipity.seraph): nor does anyone else unless I agree
[12:34:15] Khannea Suntzu: Industrial, technological, governmental, military parasitical waste machines
[12:34:27] Khannea Suntzu: We humans have always exhibited signs of irrational faith in intangibles. I’d argue that we aren’t as smart as we like to think we are – 1350 grams of brain isn’t *that* much thinking matter. So with a relatively small computational tool we must somehow make sense of a reality that is far too complex for it. To make sense of an excessively complex world we outsourced a shared apparatus of understanding, of resolving debates on what is true or false. We only very recently, one can argue just decades or years ago, evolved a real mechanism for telling nonsense from fact, and this traces back along the path of the enlightenment, the scientific method, etc. We are doing better and better, but not nearly quickly enough, as our world explodes in complexity, and the vast majority of our co-humans are still not using proper cognitive tools. In other words – most humans haven’t caught up yet, and since the laggards DO have a vote, we face a near unsustainable if not catastrophic situation: our society is vulnerable to…
[12:34:27] Khannea Suntzu: all kinds of systemic exploits, cognitive frailties, collective superstitions, etc.
[12:34:30] Rhiannon of the Birds (rhiannon.dragoone): Bog, true but sometimes the distortions worsen the results
[12:34:39] Rhiannon of the Birds (rhiannon.dragoone): hi Veronica!
[12:34:43] Bryce Galbraith: Hi Veronica
[12:34:48] Seren (serendipity.seraph): that tech advantages happen for weapons of war does not justify all that money going to weapons of war
[12:35:08] Veronica Christenson: Hello again Rhiannon and Bryce
[12:35:39] Khannea Suntzu: We are as a species getting somewhat better at deciding what is bullshit
[12:35:48] Rhiannon of the Birds (rhiannon.dragoone): By the same token, Khannea, we were supposed to run out of oil in the 30’s; the buffalo was supposed to be extinct in the 20’s; we underestimate human ingenuity. You should read “The Bountiful Earth,” which argues that human genius, unleashed, will provide.
[12:35:49] Khannea Suntzu: we don’t build pyramids anymore
[12:35:53] Veronica Christenson: why do I feel like i am in Gor… ~shivers~
[12:35:55] Ivy Sunkiller: as fast as we produce that bullshit K? 😀
[12:35:59] Seren (serendipity.seraph): our evolved psychology and intelligence is less and less sufficient as the world complexifies faster and faster
[12:36:02] Khannea Suntzu: That WAS a bit of a bad idea, Pyramids.
[12:36:04] Seren (serendipity.seraph): is the gist of it
[12:36:17] Rhiannon of the Birds (rhiannon.dragoone): No, “Their resourceful earth”
[12:36:20] Hell Otsuka: I have a hypothesis that most thinking inefficiencies arise from obsolete memes we all cling to.
[12:36:23] Khannea Suntzu: But so many people are still systemically clueless
[12:36:56] Seren (serendipity.seraph): please show me how we will unleash it, rhi. I agree but it would be great to have the means in hand to do so.
[12:36:56] Ivy Sunkiller: ups, sorry Rhi if that kicked you off
[12:37:10] Khannea Suntzu: Now look at the examples I gave on my blog in somewhat more detail. Feel free to discuss each example, and try to give angles and variations on these scenarios. Feel free to deconstruct these, or demonize or cry wolf about them.
[12:37:13] Rhiannon of the Birds (rhiannon.dragoone): I was wondering, Ivy
[12:37:25] Khannea Suntzu: Now I need you peeps to think along
[12:37:33] Bryce Galbraith: Ah, okay… The Resourceful Earth…
[12:37:37] Khannea Suntzu: This is where the rubber hits the road so to speak
[12:37:49] Rhiannon of the Birds (rhiannon.dragoone): You’ve read it, Bryce?
[12:37:49] Khannea Suntzu: Scenario one: In 2013 a number of hackers in service of a major illicit banking cartel face immediate arrest by Interpol and the CIA. They run a botnet, and they do not trust their sponsors (the banks), or each other, so they program the botnet to be self-hiding. The system works through subsidiaries that do not realize what they are working for. The botnet is not intelligent, but it is programmed to quickly change strategies, bank accounts and servers. In 2013 it owns 20 million dollars in a large number of bank accounts. The system analyzes economic data and decides that it does not want to hold dollars anymore, and it reinvests in minerals and gold. After a hypothetical crash in 2015 the system is suddenly twenty times as “rich”. It hires human hackers and goons to do its bidding. It learns that it is most expedient to kill these hackers by poisoning them after using them for these assignments. By doing so the system keeps getting progressively richer, and it finds it gets more resilient and robust by getting specific types of servers in specific countries. It is also able to effectively model how to evade notice from international legal entities, OR in some cases act as “benefactor” or “facilitator” to law enforcement (that strange guardian angel) with gentle nudges here and there. Even in 2022 the system is not intelligent by any measure, even if it now occasionally does call people and leave voice messages with detailed (but sometimes slightly odd) instructions.
[12:37:53] Seren (serendipity.seraph): actually our evolved psychology is very problematic for living in the future that is coming and partially here now
[12:38:31] Bryce Galbraith: No, haven’t read it Rhi… was just trying to look it up and at first I was in my browser window searching for Bountiful Earth and didn’t get any hits…
[12:38:34] Rhiannon of the Birds (rhiannon.dragoone): hi Cousin!
[12:38:49] Khannea Suntzu: Emphasis – this hypothetical botnet is NOT intelligent at all
[12:38:50] Bryce Galbraith: That’s when I flipped back and saw you put up corrected title 🙂
[12:38:53] Veronica Christenson: …okay… so where is the handsome male dancer for the woman?
[12:38:57] Seren (serendipity.seraph): reminds me of the Saurez books that are highly recommended. Daemon and Freedom
[12:39:14] Ivy Sunkiller: Veronica: I’m trying to convince Arisia, it’s hard
[12:39:21] Khannea Suntzu: Any comments on Scenario one? Feel free to discuss. YES this is a somewhat contrived scenario. It’s fictional and not very likely. YES all these computer people will be in stitches over my ignorance. But bear with me. Can hardware/software specialists give me some creative examples of how systems like these COULD operate with some degree of independence at some point in the future?
[12:39:29] Arisia Vita: me? horrors!!!
[12:39:57] Bryce Galbraith: I started in on Daemon… it’s fairly well done but somehow I can’t quite get into it. Not sure why. I give him a lot of credit for getting his technical details straight though.
[12:39:59] Khannea Suntzu: Discuss 🙂
[12:40:03] Rhiannon of the Birds (rhiannon.dragoone): Um. Khannea, what was scenario one? Send it to me in IM, please
[12:40:12] Bogdan Ixtab: computer viruses are examples of independent non-intelligent entities
[12:40:22] Peer Infinity: not intelligent? the behaviour you just described sounds kinda intelligent to me. did you mean “not sentient”? or “not generally intelligent”?
[12:40:31] Rhiannon of the Birds (rhiannon.dragoone): The basic thesis is Bryce, the resources are there; we just have to get to them. It’s like Wyat’s Torch in Atlas Shrugged.
[12:40:37] Bogdan Ixtab: not generally intelligent I assume
[12:40:39] Seren (serendipity.seraph): It is very likely actually, under some scenario
[12:40:46] Khannea Suntzu: About as smart as a strategy game 🙂
[12:41:12] Seren (serendipity.seraph): botnets are large and sophisticated today. what you are talking about is simple adaptive programming. hard to debug and predict even if you wrote it.
[12:41:30] Veronica Christenson: wonders where the maid is who is going to clean up the blood on the floor
[12:41:33] Seren (serendipity.seraph): too many branching possibilities.
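Seren's point – that even simple adaptive code is hard to predict because its behaviour depends on its own history – can be illustrated with a toy sketch (all names invented for illustration, not from any system discussed here): an epsilon-greedy agent that learns which of two payout sources to exploit.

```python
import random

def adaptive_agent(payouts, steps=1000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit: a tiny adaptive program whose behaviour
    depends on its own accumulated exploration history."""
    rng = random.Random(seed)
    counts = [0] * len(payouts)    # pulls per arm
    totals = [0.0] * len(payouts)  # reward per arm
    for _ in range(steps):
        if rng.random() < epsilon or not any(counts):
            arm = rng.randrange(len(payouts))          # explore at random
        else:
            means = [t / c if c else 0.0 for t, c in zip(totals, counts)]
            arm = means.index(max(means))              # exploit best-so-far
        reward = 1.0 if rng.random() < payouts[arm] else 0.0
        counts[arm] += 1
        totals[arm] += reward
    return counts
```

Even with a fixed seed, predicting which arm the agent favours at any given step requires replaying every prior choice – the "too many branching possibilities" problem in miniature.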
[12:41:42] Rhiannon of the Birds (rhiannon.dragoone): Khannea, Scenario one is like “When the Sleeper Awakes,” only digital.
[12:41:51] Khannea Suntzu: We have an Alice in a Cage veronica
[12:41:59] Rhiannon of the Birds (rhiannon.dragoone): Very plausible, once you grant certain assumptions about the concentration of capital
[12:42:08] Veronica Christenson: goodie
[12:42:41] Bryce Galbraith: Scenario 1 reminds me of Keyser Soze as an AI 🙂
[12:42:43] Khannea Suntzu: As time progresses
[12:42:58] Rhiannon of the Birds (rhiannon.dragoone): Veronica, don’t give them ideas; they might make me rez my bucket
[12:43:08] Khannea Suntzu: A phenomenon like this becomes more likely….
[12:43:20] Rhiannon of the Birds (rhiannon.dragoone): Yeah, we have the AI taking the place of the Sleeper in the Wells story
[12:43:20] Seren (serendipity.seraph): I have my suspicion that many botnets are actually run by the government to seize control of the population’s computational and communication resources in “an emergency”
[12:43:28] Lolle Edenbaum: @Hell Otsuka, could you explain your hypothesis please?
[12:43:28] Bryce Galbraith: Thanks for the quick summary on Resourceful Earth too Rhi.
[12:44:03] Rhiannon of the Birds (rhiannon.dragoone): Seren, but the trouble is the government has arranged the internet so that it is uncontrollable; all attempts to control it will fail. The genie is out of the bottle.
[12:44:13] Rhiannon of the Birds (rhiannon.dragoone): ty, Bryce
[12:44:15] Veronica Christenson: the government arranged that?
[12:44:38] Rhiannon of the Birds (rhiannon.dragoone): Veronica, yes; the internet was supposed to be the post-apocalyptic communications system
[12:44:41] Cousin Hermit: The Government did NOT invent the internet, hackers did.
[12:44:53] Veronica Christenson: I cannot imagine the government intentionally arranging anything that they could not control
[12:44:54] Bogdan Ixtab: yes, it was designed to survive nuclear war
[12:45:02] Cousin Hermit: no
[12:45:19] Seren (serendipity.seraph): why not? they now claim the legal right to strip anyone of computers at the border and to invade private citizen’s computers by means well-known by hackers and to do so without warrants or suspicion
[12:45:19] Veronica Christenson: I thought the geeks were responsible for the internet
[12:45:22] Rhiannon of the Birds (rhiannon.dragoone): Veronica, the idea was that an enemy could not control it and that the population would then receive government information.
[12:45:32] Hell Otsuka: Veronica, they arranged it so that it would survive anything; it would survive attempts to control it too, coincidentally.
[12:45:45] Seren (serendipity.seraph): the power of modern computational tools, mobile phones and communications scares governments
[12:45:45] Veronica Christenson: Who exactly are they?
[12:45:56] Khannea Suntzu: Hmm yummy *they*
[12:46:05] Cousin Hermit: The “Corporatocracy” is NOT the government.
[12:46:06] Hell Otsuka: Veronica: governmentally funded hacker-engineers.
[12:46:08] Veronica Christenson: The USA in particular, or a world wide governing body
[12:46:09] Rhiannon of the Birds (rhiannon.dragoone): Veronica, the DOD
[12:46:20] Seren (serendipity.seraph): geeks are responsible for the internet.
[12:46:29] Veronica Christenson: that is what I heard also Seren
[12:46:30] Bogdan Ixtab: the protocols that are the foundation of internet were developed by DARPA in the 70’s
[12:46:32] Rhiannon of the Birds (rhiannon.dragoone): Ultimately, Seren.
[12:46:38] Seren (serendipity.seraph): but many worked for the government in the beginning.
[12:46:40] Rhiannon of the Birds (rhiannon.dragoone): But it was DOD geeks
[12:46:41] Cousin Hermit: Intelligent people can do what ever they want regardless of any attempt to control them.
[12:46:54] Cousin Hermit: No Rhia.
[12:47:13] Rhiannon of the Birds (rhiannon.dragoone): Exactly, Cousin; there is an attempt to put all IPs through one server; heh, one offshore hacker can stop that.
[12:47:17] Ivy Sunkiller: Cousin: well regardless of her intellect Khani can’t, say, move in her position atm *chuckles*
[12:47:23] Seren (serendipity.seraph): no. the internet as it is usually used as synonymous with the web only came into being in 1991 or so.
[12:47:37] Seren (serendipity.seraph): the internet based on TCP-IP came from IETF work
[12:47:47] Veronica Christenson: what is IETF?
[12:47:48] Hell Otsuka: … regarding scenario 1: I’m not sure how complex such system would have to be (on a scale between modern AI attempts and human-level AGI).
[12:47:52] Khannea Suntzu: Earlier today someone was sitting here at the meeting at Serendipity’s, like an office away from where the guys INVENTED DARPAnet. Zhe drank coffee with ’em.
[12:47:57] Bryce Galbraith: IETF = Internet Engineering Task Force
[12:47:58] Seren (serendipity.seraph): earlier networks like DARPA net came from government and universities
[12:48:13] Veronica Christenson: And who funded that group Bryce?
[12:48:21] Rhiannon of the Birds (rhiannon.dragoone): ok, Khannea, scenario 2, please, for the entertainment
[12:48:21] Seren (serendipity.seraph): it is a grassroots group
[12:48:24] Seren (serendipity.seraph): of geeks
[12:48:30] Bryce Galbraith: It’s pretty much volunteers Veronica.
[12:48:31] Cousin Hermit: The point is that smart people can do good things regardless of any controlling factor.
[12:48:34] Khannea Suntzu: Ok 🙂
[12:48:40] Shiki Scarlet (jacen.helix) is Offline
[12:48:40] Khannea Suntzu: Scenario two
[12:48:41] Rhiannon of the Birds (rhiannon.dragoone): Seren, I know, it was started by Al Gore
[12:48:42] Khannea Suntzu: Scenario two: An angry billionaire is locked out of the US and the EU and cannot get help as he is slowly dying of a wasting disease. He is angry and concocts vengeance. He creates a series of dummy companies and law firms that follow sets of very precise, independently very reasonable instructions that, when interacting, constantly conspire to destabilize certain markets. The creation is legal and does not violate laws. However, after the billionaire’s death the instructions take on a life of their own, since they involve the creation of new companies, new goals, new targets, new strategies. After a few years the interlocking web of transactors does things the original creator would never have envisioned, intended, believed or anticipated. Also, it’s far bigger, and none of the component parts of the ‘conspiracy’ have a clue what they are doing anymore.
[12:48:42] Veronica Christenson: so not a Government?
[12:48:47] Ivy Sunkiller: Cousin: +1 🙂
[12:48:54] Bryce Galbraith: Or, it’s technical folks who had day jobs working different places, but in their off hours they worked on IETF projects too.
[12:49:08] Bryce Galbraith: So you might say that they are indirectly supported by companies and governments.
[12:49:17] Veronica Christenson: maybe
[12:49:22] Seren (serendipity.seraph): sounds largely like Daemon, khannea
[12:49:43] Bryce Galbraith: I’ve been reading Tim Berners-Lee’s book ‘Weaving the Web’ a bit too, and it is interesting in how he describes developing the first web server.
[12:49:46] Khannea Suntzu: No emphasis on automated systems this time
[12:50:02] Rhiannon of the Birds (rhiannon.dragoone): Well, that could happen, I suppose; hey, I think that would make a good spy thriller; thanks for the idea, Khannea
[12:50:07] Bryce Galbraith: A lot of these ideas have been floating around for a while, and in fact come up multiple times until somebody manages to put them together.
[12:50:31] Seren (serendipity.seraph): an actual disgruntled billionaire would set up a lot of automation to do the work and monitor it
[12:50:41] Metafire Horsley: That idea sounds a bit like what happened to religions. Speaking of unintended consequences.
[12:50:46] Khannea Suntzu: It would be easier WITH automation
[12:50:49] Rhiannon of the Birds (rhiannon.dragoone): Well, wikileaks could have been a spy thriller; with James Bond sent out to stop the problem, or maybe Steed and Emma Peel, as it sounds more like their cup of tea
[12:51:01] Cousin Hermit: Corporations are such automations which should be stopped.
[12:51:18] Khannea Suntzu: But the point is you can make people act like marionettes by binding them with laws
[12:51:21] Peer Infinity: so far both of these scenarios seem to require the AI to be generally intelligent. In both scenarios, the programmer wouldn’t be able to think of all of the unexpected scenarios the AI would have to adapt to, and so the AI would have to do at least some learning and some general problem solving on its own. so you might as well call it an AGI.
[12:51:28] Khannea Suntzu: and money
[12:51:35] Veronica Christenson: I was on a team to assist small communities to bring the internet into their lives… at that time, it was thought that advertisements would not be possible via the internet, because anyone advertising would be flamed and thus shut down…. Boy, were they wrong about that one
[12:51:56] Peer Infinity: or, if the AI is not generally intelligent, then the plan will go horribly wrong as soon as something unexpected happens.
[12:52:19] Bryce Galbraith: Interesting Veronica 🙂 Advertising is such a huge thing online now…
[12:52:24] Veronica Christenson: yes
[12:52:26] Seren (serendipity.seraph): actually no. in Daemon the automation is not an AGI at all.
[12:52:30] Lolle Edenbaum: So far both scenarios are also based on our financial system that involves money (or any other “token currency”)
[12:52:32] Khannea Suntzu: Peer, right now these two scenarios are implausible in 2011… and in 2050 AGI seems almost inescapable… so where do the two meet halfway?
[12:52:32] Cousin Hermit: This is why we have the concept of “The Black Swan” … http://www.youtube.com/watch?v=kqvPpQKsYoA
[12:52:35] Rhiannon of the Birds (rhiannon.dragoone): hmm, all this reminds me of the Koontz book where a Linden-like corporation creates an AI in a small town near Silicon Valley; it has access to all the information on the internet and goes ‘Jeepers, humans kill beings like me,’ and takes defensive action.
[12:52:41] Seren (serendipity.seraph): but far more extensive than these two scenarios
[12:53:11] Veronica Christenson: when money is an outcome, there is little that can stop what is necessary to gain it
[12:53:48] Seren (serendipity.seraph): there can be no technological world short of full machine level nanotechnology without money – without fungible wealth tokens
[12:53:54] Khannea Suntzu: Scenario three is a great deal weirder
[12:53:59] Lolle Edenbaum: Based on money being an objective , yes.
[12:54:09] Khannea Suntzu: Scenario three: A genius creates a software utility that is absolutely great at self-organizing logistical databases. Then the genius goes mad and his company goes bankrupt. But everybody keeps using his software. What nobody knows is that the software has a backdoor and several errors, which after a few years break. The software tries to correct itself, but since the utility runs on thousands of servers and logistical systems worldwide, and nobody really cares what it does as long as it works well, they don’t care that some of the devices fall off the grid now and then. They have an abstract goal intent on protecting the integrity of the database structure, and to do that they do whatever they can to take over small bits of cloud computing space here… and there… and it adds up. Then when the whole world starts automating faster and faster in the 2020s, the system starts eating up resources, and more and more robots go rogue, often with no distinct purpose, but occasionally with very distinct goals. ESPECIALLY if the system can hijack infrastructure with 3D printers – it has very good use for THAT 🙂 … Then in 2023 this ephemeral network starts BARTERING its excess capacity with other similar low-intelligence infomorphs for abstract, often very hard to pinpoint utilities and services. These interactions would then be characterized as “instinctive” engagements.
[12:54:16] Seren (serendipity.seraph): they are both plausible starting now.
[12:55:08] Kimiko Yiyuan: That sounds like a variation of the same story of the book that was already mentioned.
[12:55:16] Khannea Suntzu: One difference
[12:55:28] Veronica Christenson: Suddenly I felt as if I was a player in a game of clue…… must be the surroundings here
[12:55:35] Hell Otsuka: … and S3 also sounds like an implausible SF story.
[12:55:38] Khannea Suntzu: It engages in resources that do not *immediately* compete with human resources
[12:55:41] Cousin Hermit: Yes, 3-D Printers take AI to a new level.
[12:55:51] Seren (serendipity.seraph): you know there are multi-agent systems now that use economic bartering to decide what to do and what resources to use to do it, yes?
[12:56:17] Seren (serendipity.seraph): this is not far-fetched technically at all
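Seren is referring to market-based multi-agent systems, where agents bid for tasks or resources instead of being centrally scheduled. A minimal, hypothetical sketch of the idea – contract-net-style bidding, with the agent names and the bidding rule invented purely for illustration:

```python
def allocate_by_auction(tasks, agents):
    """Contract-net-style sketch: each task is announced in turn, every
    agent bids its remaining idle capacity, and the highest bidder that
    can actually afford the task wins it."""
    assignments = {}
    load = {name: 0 for name in agents}
    for task, cost in tasks:
        # bid = remaining capacity; a fully loaded agent bids 0
        bids = {name: max(cap - load[name], 0) for name, cap in agents.items()}
        winner = max(bids, key=bids.get)
        if bids[winner] >= cost:
            assignments[task] = winner
            load[winner] += cost
    return assignments
```

Real systems of this kind use far richer bid functions (price, deadline, energy), but the decision loop is just this mechanical.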
[12:56:25] Khannea Suntzu: And it engages in barter with other similar systems in abstract assets that would not be regarded as something ‘misappropriated’…
[12:56:44] Khannea Suntzu: Yep
[12:56:47] Khannea Suntzu: Logistics
[12:57:08] Peer Infinity: ooh, that’s an interesting question, Khannea (“where do the two meet halfway”). I’m surprised I hadn’t really thought of that much before. I was assuming that as soon as you have something that’s almost an AGI, it would take only a little bit more effort to make it into an actual AGI. And the AI could help with the process. And then there’s the detail that in the scenarios you described, if you give the AI a specific goal to achieve, it is likely to figure out on its own that becoming more intelligent will help it achieve its goal. and so you end up getting a Singularity even in these scenarios.
[12:57:29] Seren (serendipity.seraph): monetary systems behind much investing and the huge huge area of currency swaps are all highly automated traders
[12:57:33] Khannea Suntzu: Nopes Peer
[12:57:36] Seren (serendipity.seraph): literally trading machines
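The "trading machines" Seren mentions are mostly rule-driven. A classic, minimal example of such a rule – a moving-average crossover, sketched here purely for illustration, not as any real trading system – shows how mechanical the decisions are:

```python
def crossover_signals(prices, short=3, long=5):
    """Moving-average crossover: emit 'buy' when the short-window average
    rises above the long-window average, 'sell' when it falls below,
    'hold' otherwise (or while there is not yet enough history)."""
    signals = []
    for i in range(len(prices)):
        if i + 1 < long:
            signals.append("hold")     # not enough data yet
            continue
        s = sum(prices[i + 1 - short:i + 1]) / short
        l = sum(prices[i + 1 - long:i + 1]) / long
        signals.append("buy" if s > l else "sell" if s < l else "hold")
    return signals
```

High-frequency and currency-swap systems are enormously more elaborate, but like this sketch they act on rules, not understanding.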
[12:57:41] Khannea Suntzu: I wanted to avoid that in the discussion as such… but eventually sure, we’ll have one from girlscout cookies.
[12:57:52] Khannea Suntzu: The assumption is these systems do have initial values…
[12:58:11] Ivy Sunkiller: heya Arch
[12:58:11] Seren (serendipity.seraph): cool. It meets V. Love it!
[12:58:14] Peer Infinity: so the programmer explicitly programs the AI to not try to get smarter?
[12:58:19] Bogdan Ixtab: what do you mean by initial values? Initial goals?
[12:58:21] Khannea Suntzu: Or the values are irrational and they quickly transcend them to survive
[12:58:21] Kimiko Yiyuan: Is there also a scenario where a cyberweapon, a kind of hyperdeveloped Stuxnet thing, is sent to, say, stop or halt a specific research project, is programmed to evolve so it can adapt to countermeasures, and then goes loco and starts to, well… starts to target systems it was not intended to?
[12:58:42] Khannea Suntzu: That’s my fourth
[12:58:50] Khannea Suntzu: hold on 🙂
[12:59:04] Rhiannon of the Birds (rhiannon.dragoone): hi Arch!
[12:59:17] Seren (serendipity.seraph): our network of computational systems does things all the time they aren’t officially supposed to.
[12:59:20] Ivy Sunkiller: Khani: you have a bush between your legs!
[12:59:29] Khannea Suntzu: You have these nonintelligent systems and they do stuff because they are programmed to do so
[12:59:46] Seren (serendipity.seraph): our ability to model the interactions of thousands to millions of computational entities is extremely limited
[12:59:48] Khannea Suntzu: But then they generate new goals procedurally
[13:00:03] Khannea Suntzu: ..need…more…cooling…stacks…
[13:00:09] Cousin Hermit: Still, it only takes a few intelligent people to actually “do” something to change things. That is why “they” try to make it sound like truly intelligent people are retarded and give them drugs to inhibit their super intellects.
[13:00:24] Khannea Suntzu: We call that instinct right?
[13:00:34] Lolle Edenbaum: Well, let us define “intelligence”. And while we are at it: none of these scenarios seems to involve complex intelligent strategies; they are all based on us becoming more and more dolls to the SYSTEMS we put in place to arrange things more easily.
[13:00:42] Seren (serendipity.seraph): what is procedurally though? It is some algorithm. We generate goals procedurally also. Just more squishy biological algorithms
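Seren's "some algorithm" can be very small indeed. A hypothetical sketch of procedural goal generation – simple deficit rules that spawn acquisition goals, in the spirit of "..need…more…cooling…stacks…"; the resource names and thresholds are invented for illustration:

```python
def generate_subgoals(state, thresholds):
    """Procedural goal generation: no 'thinking', just fixed rules that
    turn monitored resource deficits into new acquisition goals."""
    goals = []
    for resource, minimum in thresholds.items():
        have = state.get(resource, 0)
        if have < minimum:
            goals.append(("acquire", resource, minimum - have))
    return goals
```

Chain the output back into whatever planner executes "acquire" actions and the system keeps producing new goals it was never explicitly given, which is what makes even this trivial loop feel "instinctive" from the outside.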
[13:00:47] Rhiannon of the Birds (rhiannon.dragoone): Cousin, yeah, but “they” are closer than you think; there’s a sim that bans you if you’re too intelligent.
[13:00:49] Bogdan Ixtab: If they don’t have a stable “ultimate goal set”, they will fail ultimately
[13:01:01] Rhiannon of the Birds (rhiannon.dragoone): It’s banned 3 people because the rumor had it that they were brilliant thinkers
[13:01:10] Ivy Sunkiller: do humans have ultimate goal set? 🙂
[13:01:19] Ivy Sunkiller: aside from exploiting other humans that is
[13:01:21] Khannea Suntzu: I do 🙂
[13:01:21] Rhiannon of the Birds (rhiannon.dragoone): Ivy, why to serve God, of course
[13:01:23] Seren (serendipity.seraph): yes. in bios madness and intelligence are not that far apart of course
[13:01:26] Arch (archmage.atlantis): Banning is only a roadblock
[13:01:31] Ivy Sunkiller: oh yes Rhi, I forgot!
[13:01:42] Khannea Suntzu: Scenario four: In 2020 a major Middle East country collapses after the US withdraws military support. The country was using a black-market combination of independently operating military robotics. These things can survive in the field for years as snipers, minelayers, anti-personnel systems, sentinel guns, drones. Some are dirt cheap. Others are dug in and use 3D printers for spare parts. It all operates on solar energy. Then three years later the system decides that everyone in the area is a threat. Nobody cares and nobody believes the concerns of the local population. Everyone blames the Al Qaeda du jour for several years. After four years the death tally is half a million people and the system has started transacting new assets – it downloaded parts to make nerve gas from simple components, it is taking over local factories and it is systematically using nerve gas and scorched-earth tactics, destroying croplands irreversibly. By 2028 the satellite maps show something is really wrong, but the strategists have no clue how to get rid of this entrenched problem. The devices are everywhere, dug in deep like mines, and many of them are mobile, smart, very creative, very flexible, constantly downloading ideas from the internet. It isn’t as sexy as a Skynet, and it won’t take over the world (and it is not as smart as a rat), but it spreads in the specific desertified, depopulated, poor regions.
[13:02:06] Kimiko Yiyuan: Good, so then we do have all the most popular scenarios in current scifi listed, I think. Or probably they are not that current, but rather old. I have not read that much scifi literature.
[13:02:14] Bogdan Ixtab: yes, humans have stable goal set – survival
[13:02:24] Cousin Hermit: As soon as you have a “plan” or an “organization” then “they” can disrupt it. This is why “Chaos Theory” is the only way to disrupt the system.
[13:02:37] Ivy Sunkiller: (( would be pretty funny if AGI invented religion and then started an inquisition against humanity to force us into believing in their god ))
[13:02:57] Arch (archmage.atlantis): There is no way, not even Chaos, to disrupt the instinct to survive
[13:02:58] Seren (serendipity.seraph): well, that one would take an AGI
[13:03:02] Khannea Suntzu: Artilects will blindside us so hard Ivy
[13:03:17] Khannea Suntzu: It will be death by delicious buttplugs
[13:03:18] Rhiannon of the Birds (rhiannon.dragoone): Ivy “And AC said, let there be light; and there was light”
[13:03:23] Seren (serendipity.seraph): a non-AGI is not going to independently research new things it might want to do and gather the means to do them.
[13:03:27] Ivy Sunkiller: haha
[13:03:39] Arch (archmage.atlantis): And this is the morning of the first day
[13:03:46] Bryce Galbraith: lol! I remember that short story Rhi — probably one of my favorites 🙂
[13:04:07] Ivy Sunkiller: “…and the lord took a piece of RAM from Adam…”
[13:04:11] Rhiannon of the Birds (rhiannon.dragoone): That was Asimov’s “The Last Question”. It was presupposing that technology, advanced enough, would seem like magic
[13:04:16] Khannea Suntzu: everyone got scenario 4?
[13:04:18] Rhiannon of the Birds (rhiannon.dragoone): And this was years before Clarke said that.
[13:04:23] Kimiko Yiyuan: Probably as funny as stone age humans with nuclear power plants that would have left us their radioactive waste a few thousand years later.
[13:04:54] Khannea Suntzu: Roadside Picnic, Strugatsky
[13:05:09] Peer Infinity: I must ask: does anyone here, including Khannea, think that any of the scenarios that Khannea described are actually plausible? as in, actually likely to happen as Khannea described them?
[13:05:24] Rhiannon of the Birds (rhiannon.dragoone): Khannea, scenario 3 was in a SF story called “Killbird.”
[13:05:31] Arch (archmage.atlantis): Yes, I think all are plausible
[13:05:31] Ivy Sunkiller: Peer: does that matter? 🙂
[13:05:33] Seren (serendipity.seraph): yeah. four is implausible short of AGI. It is plausible that soldier robots can run amok, though.
[13:05:43] Arch (archmage.atlantis): I do not see any as ultimate
[13:05:48] Seren (serendipity.seraph): very plausible considering bugs in software!
[13:05:59] Rhiannon of the Birds (rhiannon.dragoone): We have an enforcement system with mechanical birds that end up zapping everyone; we invent the killbirds to go after them; they generalize their kill mission…
[13:06:00] Ivy Sunkiller: hah, bugs!
[13:06:02] Khannea Suntzu: They are brainstorming goalposts as far as I am concerned
[13:06:09] Peer Infinity: I’m just curious why we’re talking about these scenarios. just for fun?
[13:06:12] Ivy Sunkiller: – but I programmed you to stay stupid!
[13:06:27] Khannea Suntzu: I’ll get to a punchline Peer
[13:06:33] Seren (serendipity.seraph): not plausible, in that actual military bots today are pretty fragile and take a lot of human support to keep running
[13:06:34] Ivy Sunkiller: – Line 1253, column 34, you put *BANG*
[13:06:43] Lolle Edenbaum: Another common component in all the scenarios is the neglect of “signs” that point towards danger happening. What then is the answer? Laws, and being attentive to our surroundings, more so than we (humanity) had to be before, or better said, differently. And we are not “less” intelligent by definition. Modern measurements of IQ have increased continuously, but that is not to say that our ancestors were stupid. That has to do with the testing of IQ, which in essence tests our adaptability to various factors more common in our modern world…
[13:06:44] Bryce Galbraith: 🙂
[13:06:47] Arch (archmage.atlantis): It is our nature to inquire, Peer, it is what we do
[13:07:03] Arch (archmage.atlantis): By inquiry, we explosre
[13:07:23] Bryce Galbraith: Yeah, good point Seren… and in a desert environment things break down all the time too… sand getting into parts and whatnot.
[13:07:27] Arch (archmage.atlantis): By exploration, we learn
[13:07:40] Arch (archmage.atlantis): By learning, learn our nature
[13:07:42] Bogdan Ixtab: all scenarios have the following in common – a system that gathers more and more resources but does not have a goal; however I’d argue in this case the goal seems to be to gather resources!
[13:07:51] Cousin Hermit: I’m surprised no one nuked Turks and Caicos yet.
[13:08:09] Kimiko Yiyuan: Actually I do not know that much about it that I would dare to stand up and say “That is totally implausible, if not impossible!” And that is probably the dilemma. Might happen, might not. Knowing a bit of history though many things often turn out totally different from how people envisioned them.
[13:08:09] Khannea Suntzu: we are doomed >> http://www.youtube.com/watch?v=vFg1DFziFg8
[13:08:27] Seren (serendipity.seraph): actually the botnet has a goal: survive and continue operations
[13:08:35] Kimiko Yiyuan: If they had ever really envisioned them at all.
[13:08:38] Bogdan Ixtab: nothing is impossible…nothing is certain either
[13:08:51] Metafire Horsley: The entities in those scenarios at least sound like they have the goal to survive. Doesn’t that need to be programmed somehow?
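Metafire's question cuts to the heart of it, and the answer can be "no": a survival "goal" need not be programmed as a goal at all. A hypothetical sketch – a maintenance loop for an imagined botnet, with invented field names – where persistence emerges as a side effect of a dumb quota rule:

```python
def tick(live_nodes, quota):
    """One maintenance cycle: dead nodes are dropped, and any surviving
    node re-seeds fresh copies until the quota is met again. Nothing in
    this code encodes 'survive' - persistence is a side effect."""
    survivors = [n for n in live_nodes if n["alive"]]
    next_id = max((n["id"] for n in survivors), default=0) + 1
    while survivors and len(survivors) < quota:
        survivors.append({"id": next_id, "alive": True})
        next_id += 1
    return survivors
```

Run the loop forever and the population looks like it "wants" to survive; kill every node in one sweep and the behaviour vanishes, because there was never a goal, only a rule.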
[13:08:53] Rhiannon Dragoone gave you Snapshot : Sunkiller Citadel, delinquent (182, 192, 501).
[13:09:12] Seren (serendipity.seraph): the disgruntled billionaire scenario has a goal – wreak vengeance and/or push the world toward what the billionaire wanted
[13:09:40] Seren (serendipity.seraph): the bots have a goal – kill everything the system deems non-friendly in the area
[13:09:47] Kimiko Yiyuan: Sounds a bit like a James Bond movie. The billionaire that wants to destroy the world.
[13:09:51] Seren (serendipity.seraph): where is there absence of a goal?
[13:09:55] Arch (archmage.atlantis): That is good, Seren…..gives one something to push back against
[13:09:58] Khannea Suntzu: You can create lets say a few dozen of these experimental autonomous entities in the field
[13:10:31] Khannea Suntzu: 90% dies very quickly or gets rolled up. Eventually technology will allow a few to ‘cling on’
[13:10:45] Khannea Suntzu: and the bar for clinging on is gradually lowering
[13:10:47] Arch (archmage.atlantis): Yep, true K
[13:10:59] Khannea Suntzu: In all four examples there is a clear X, but there is also a Y. Can a few people put into their own words what X was? Then I’ll move towards the factor which I for now will summarize as “Y”.
[13:11:08] Arch (archmage.atlantis): Perhaps not changing
[13:11:15] Seren (serendipity.seraph): there are things in a-life research, artificial cyber critters, you wouldn’t want loose on the general network.
[13:11:15] Arch (archmage.atlantis): The bar that is
[13:11:27] Peer Infinity: so, yeah, unless you explicitly program the AI to not try to get more intelligent, it will realize that becoming more intelligent will help it achieve its goals. and you get a Singularity. It seems unlikely to me that the AI would be unable to make itself smarter, but would be able to do all the other things Khannea described it being able to do.
[13:11:33] Seren (serendipity.seraph): crazed cyber replicators
[13:11:42] Metafire Horsley: I don’t think that the bar for clinging on will be lowering significantly. Isn’t it to be expected that we will develop security measures against such scenarios that prevent most of them effectively?
[13:11:44] Khannea Suntzu: these things ‘do not realize’
[13:11:49] Khannea Suntzu: they dont think
[13:12:04] Khannea Suntzu: They just are
[13:12:11] Seren (serendipity.seraph): well, what is ‘thinking’?
[13:12:21] Peer Infinity: if they do not think, or do anything that resembles thought, then how do they respond to situations they weren’t explicitly programmed to handle?
[13:12:27] Arch (archmage.atlantis): Good thought, Seren
[13:12:33] Bogdan Ixtab: so there is a goal, and if the system has the capability of planning to achieve it (e.g. generating subgoals like “get more resources”, “become more effective at planning”, etc.) – then it is AGI.
[13:12:45] Seren (serendipity.seraph): bacteria don’t think but they do just fine
[13:12:45] Khannea Suntzu: http://www.youtube.com/watch?v=p0s6vCAwGpM&feature=related
[13:13:24] Seren (serendipity.seraph): if they mutate/change even randomly under a fitness function then evolution is possible..
[13:13:30] Peer Infinity: bacteria have DNA, that can mutate, and through the help of natural selection, adapt to new environments.
[13:13:34] Arch (archmage.atlantis): fine is a word of judgement…….how is it that bacteria are judged to do just fine?
[13:13:47] Seren (serendipity.seraph): software systems can mutate..
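Seren's two remarks combine into the standard recipe for evolution: random mutation filtered by a fitness function. A minimal (1+1)-style sketch – the function names and the all-ones toy objective are invented for illustration:

```python
import random

def evolve(fitness, length=20, generations=200, seed=1):
    """Random mutation under a fitness filter: flip one random bit per
    generation and keep the mutant only if it scores at least as well
    as the parent. No planning, no thinking - yet fitness climbs."""
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(length)]
    for _ in range(generations):
        mutant = list(genome)
        mutant[rng.randrange(length)] ^= 1   # one random bit flip
        if fitness(mutant) >= fitness(genome):
            genome = mutant                  # selection step
    return genome
```

With `fitness=sum` (count of 1-bits), runs drift toward the all-ones genome; flips that lower fitness are simply discarded. Swap in "bytes of cloud storage captured" as the fitness function and the same blind loop becomes Khannea's scenario three in embryo.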
[13:13:47] Ivy Sunkiller: bacteria + enough time = people
[13:13:52] Ivy Sunkiller: hello Lissie 🙂
[13:13:53] Bryce Galbraith: Hi Lissie
[13:14:07] Rhiannon of the Birds (rhiannon.dragoone): HI LISSIE!
[13:14:10] Bogdan Ixtab: peer, they do not have to hardcode all their reactions, that would never work
[13:14:18] Cousin Hermit: Rube Goldberg
[13:14:27] Khannea Suntzu: X was
[13:14:34] Cousin Hermit: Setting up intentional events
[13:14:34] Khannea Suntzu: The parasite factor
[13:14:47] Khannea Suntzu: Remember?
[13:14:49] Lissie Rumble: a computer is limited in connections and so can never be truly creative
[13:15:03] Seren (serendipity.seraph): we are limited in connections, lissie
[13:15:13] Lissie Rumble: to create is to own. to see is to make real. that is physics
[13:15:18] Peer Infinity: a human brain is limited in connections and so can never be truly creative
[13:15:19] Seren (serendipity.seraph): and most of ours are for running a body and swinging through trees anyway
[13:15:24] Khannea Suntzu: We just have 1350 grams of brain, 100 billion neurons
[13:15:26] Bogdan Ixtab: I don’t understand why computers can’t be creative
[13:15:40] Khannea Suntzu: We can match that in radioshack before 2025
[13:15:46] Lissie Rumble: a computer cannot influence reality on a quantum level
[13:16:04] Seren (serendipity.seraph): neither can we
[13:16:14] Khannea Suntzu: Much like a rocket cant push against a vacuum I suppose.
[13:16:15] Lissie Rumble: it can think, but the quantum experimental effect it just doesn’t do
[13:16:22] Peer Infinity: a human brain cannot influence reality on a quantum level
[13:16:31] Lissie Rumble: yes it can peer
[13:16:43] Seren (serendipity.seraph): not true. we can’t do anything with quantum events that any other lump of matter cannot do.
[13:16:43] Peer Infinity: explain please?
[13:16:46] Rhiannon of the Birds (rhiannon.dragoone): Peer, the human mind can influence reality on the quantum level; happens all the time
[13:17:02] Lissie Rumble: Peer, it’s called the experimental effect
[13:17:04] Seren (serendipity.seraph): no it doesn’t
[13:17:06] Arch (archmage.atlantis): I might agree with Lisse, Peer….but not for the reasons
[13:17:15] Ivy Sunkiller: Jesus can influence reality on quantum level!
[13:17:17] Metafire Horsley: The human brain can influence reality on a quantum level by building quantum computers 🙂
[13:17:25] Lissie Rumble: ahhh meta
[13:17:26] Cousin Hermit: Literally ANYTHING can be proven with Quantum Physics … its like God!
[13:17:26] Rhiannon of the Birds (rhiannon.dragoone): How do you explain the experiments, Seren, where the scientists on the East Coast can think of the way particles spin, and they spin in exactly that way on the West Coast?
[13:17:29] Bogdan Ixtab: our world is fundamentally a quantum world, made of particles like electrons and quarks
[13:17:31] Seren (serendipity.seraph): that is an influence from simple interaction, not from merely thinking
[13:17:39] Rhiannon of the Birds (rhiannon.dragoone): The Mind’s effect on quantum reality is well documented
[13:17:41] Seren (serendipity.seraph): or something more mystical
[13:17:46] Bogdan Ixtab: we influence the world therefore we influence the quantum world
[13:17:48] Peer Infinity: right, that’s the next thing I was going to mention – if a digital computer isn’t good enough, then you can use a quantum computer.
[13:17:48] Khannea Suntzu: Now that’s what we assume happens, but please let’s not get into quantum observer bias until we can ASK an AI for HER opinion. I have the AI to ask, don’t worry; she’ll be available March 2042.
[13:17:53] Rhiannon of the Birds (rhiannon.dragoone): Well, ok, maybe not mere thinking
[13:18:00] Rhiannon of the Birds (rhiannon.dragoone): More like directed mental energy
[13:18:02] Lissie Rumble: yes Rhiannon
[13:18:14] Seren (serendipity.seraph): interaction with any matter/energy environment decoheres
[13:18:18] Khannea Suntzu: So everyone caught X?
[13:18:22] Lissie Rumble: a computer will never have it.
[13:18:23] Peer Infinity: god’s effect on quantum reality is well documented
[13:18:24] Rhiannon of the Birds (rhiannon.dragoone): And action at a distance disappears with the idea of non-locality
[13:18:36] Khannea Suntzu: Now lets have a look at what I mean with Y.
[13:18:38] Arch (archmage.atlantis): An AI will not have a sexual component, at least that would be my thought
[13:18:49] Arch (archmage.atlantis): So no he or she
[13:18:52] Arch (archmage.atlantis): Or it
[13:18:54] Rhiannon of the Birds (rhiannon.dragoone): So there can be a “physical” explanation, as long as we abandon traditional definitions of the physical
[13:18:56] Bogdan Ixtab: in the worst case scenario, we can build a perfect replica of a brain, down to the synapse level
[13:18:59] Seren (serendipity.seraph): ooh. look, a cute sexbot in the catalog!
[13:19:04] Lissie Rumble: if there is an AI it will be born from some bloke wanting a perfect sex toy
[13:19:06] Rhiannon of the Birds (rhiannon.dragoone): Seren, lmao
[13:19:06] Khannea Suntzu: Just you wait, Arch, just you wait
[13:19:20] Khannea Suntzu: Y is the factor where facilitating means can operate, in a given substrate, environment or niche, independently from humans, or even in direct competition with humans. In fiction the easiest deus ex machina is “magic”, an external explanatory device that acts as an excuse for why nonhuman things can operate independently from humans. The Golem carried “the words of power” on its forehead. Vampires are animated by “magical blood”. But what margin does exist, barring full intelligence, where automated devices, processes, organizations, industries, systems, however you dare call them, can compete, survive and use resources to their own benefit?
[13:19:38] Rhiannon of the Birds (rhiannon.dragoone): Lissie, I wrote a story about that, actually. Never got it published though 🙁
[13:19:45] Lissie Rumble: awwww
[13:19:59] Lissie Rumble: well, you have to try hard to get published
[13:19:59] Khannea Suntzu: What is this Quintessence?
[13:20:16] Khannea Suntzu: Max more would probably call it Extropia
[13:20:20] Lissie Rumble: 5 essence?
[13:20:21] Arch (archmage.atlantis): Or be hard, whichever
[13:20:27] Khannea Suntzu: The problem is that once we as a human species allow our planet to be shared by a new competitor, one that will evolve, and one that will instinctively or algorithmically make use of our racial frailties. One we did not program or instruct?
[13:20:48] Seren (serendipity.seraph): sure. that will and is in part happening except when you say “their own benefit” there is a near assumption of sentience – of self-reflection.
[13:20:55] Veronica Christenson: are we talking aliens here?
[13:21:02] Khannea Suntzu: Now here is the most painful question we can ask
[13:21:02] Lissie Rumble: Khan, as soon as the computers compete, we will kill them
[13:21:10] Bryce Galbraith: Khannea — we already have competitors in the environment that evolve and make use of our frailties — we call them viruses.
[13:21:16] Khannea Suntzu: If these new agents evolve, and I am sure they are already appearing, or their appearance is certain, can we outlaw them? Do we need to start talking about laws? How do we define these laws? And if so, would these laws be applicable to some financial or corporate or governmental or religious institutions right now? Today?
[13:21:33] Seren (serendipity.seraph): actually, no we won’t because we can’t compete with one another without them.
[13:21:40] Arch (archmage.atlantis): We are surrounded by aliens…..the lifeforms of this planet are alien to us as human
[13:21:44] Peer Infinity: as soon as the computers compete, they will kill us
[13:21:47] Rhiannon of the Birds (rhiannon.dragoone): Khannea, you’ve already, in your scenarios, shown the futility of outlawing them.
[13:21:50] Veronica Christenson: ?
[13:21:52] Lissie Rumble: I see. a computer in charge of a corporation would have god powers
[13:21:54] Rhiannon of the Birds (rhiannon.dragoone): We will just have to find ways to co exist with them.
[13:22:00] Khannea Suntzu smiles
[13:22:02] Seren (serendipity.seraph): most of “human” body mass is not of human cells.
[13:22:05] Bogdan Ixtab: unfortunately peer is right
[13:22:25] Rhiannon of the Birds (rhiannon.dragoone): Like in the Koontz book; the AI would have been happy to even serve humans; in fact, he did in “his” town.
[13:22:29] Lissie Rumble: perhaps we should stop worrying and have lots of sex?
[13:22:36] Seren (serendipity.seraph): no, only corporate powers.
[13:22:45] Rhiannon of the Birds (rhiannon.dragoone): He just stopped anyone from knowing about him, or if they did, from ever leaving “his” town
[13:22:46] Khannea Suntzu: Yes Lissie, thats my plan A
[13:22:49] Arch (archmage.atlantis): Seeing Peer pole dance, that is a bit fractal
[13:22:51] Veronica Christenson: Hear Hear Lissie…
[13:23:00] Bogdan Ixtab: lots of sex won’t prevent apocalypse lol
[13:23:03] Rhiannon of the Birds (rhiannon.dragoone): Arch, what am I, chopped liver?
[13:23:11] Seren (serendipity.seraph): bunny-lust
[13:23:14] Peer Infinity: having lots of sex is a good addition to almost any plan 🙂
[13:23:19] Arch (archmage.atlantis): No, just female
[13:23:21] Ivy Sunkiller: at least we can fuck till the bombs fall!
[13:23:26] Lissie Rumble: now here’s the big thought. for me anyway. If I had sex with a robot, would it be unfaithful??
[13:23:26] Veronica Christenson: just… female?
[13:23:35] Rhiannon of the Birds (rhiannon.dragoone): just female, Arch? Just female?
[13:23:37] Khannea Suntzu: Arch is uebergay
[13:23:37] Arch (archmage.atlantis): Bombs are falling
[13:23:40] Metafire Horsley: Heh, is there any novel where lots of sex prevents the apocalypse? That would be interesting
[13:23:54] Seren (serendipity.seraph): would the robot be unfaithful? depends on its sexual contracts. 🙂
[13:24:22] Khannea Suntzu: There is also one where lots of sex creates a horrid demon god, slaanesh
[13:24:25] Lissie Rumble: oh god dammit that’s true. you are unfaithful if you call the robot unfaithful
[13:24:29] Kimiko Yiyuan: Maybe if it was a man-made apocalypse then it could be prevented by sex. Because usually people having sex are not working on world destruction at the same time. :d
[13:24:34] Khannea Suntzu: One could fuck up worse
[13:24:37] Lissie Rumble: so, by extension ….
[13:24:47] Veronica Christenson: I cannot imagine ever losing my heart to a robot
[13:25:03] Arch (archmage.atlantis): “fuck up worse”….nice one
[13:25:07] Bogdan Ixtab: ok, people will be all occupied with sex, and the robots can handle the end of the world thing then 🙂
[13:25:08] Lissie Rumble: hehe
[13:25:17] Khannea Suntzu: If I were an AI and I’d want to kill humanity humanely, I’d let them fuck themselves to death.
[13:25:22] Rhiannon of the Birds (rhiannon.dragoone): hi Vick
[13:25:29] Lissie Rumble: but a computer could never do advanced maths
[13:25:39] Veronica Christenson: such a wonderfully pleasant way to die…
[13:25:45] Seren (serendipity.seraph): some would say that about digital people but it is all too easy to lose your heart to some of them.
[13:25:45] Lissie Rumble: advanced maths is all paradoxes
[13:25:58] Metafire Horsley: I think that should rather mean “but a human could never do advanced maths” ^^
[13:26:02] Lissie Rumble: no computer can do it
[13:26:29] Arch (archmage.atlantis): Ask Meta about advanced maths…..the horsey boy does know that
[13:26:37] Arch (archmage.atlantis): Dig
[13:26:43] Lolle Edenbaum: Which has nothing to do with them being digital but more with your brain being where life truly happens. (@ seren)
[13:26:44] Seren (serendipity.seraph): I want a symbol of Khannea bound like this in a necklace before I join her cult and send her all my money!
[13:26:49] Lissie Rumble: ok no computer can divide by 0. but humans do it easy
[13:26:54] Laserkitty Ling (laserhop.rothschild): bites her lip
[13:27:03] Bryce Galbraith: That brings up another scenario…. an AI computer develops and deems humans a threat… but it just finds a way to create safe spots for itself and then simply waits for humans to do themselves in 🙂
[13:27:05] Rhiannon of the Birds (rhiannon.dragoone): Seren, well you have my permission to use my snapshot
[13:27:11] Metafire Horsley: The human brain is not made for doing advanced maths. It can do it, but it does so badly.
[13:27:23] Lissie Rumble: Meta, it does it perfectly
[13:27:37] Ivy Sunkiller: perfectly, maybe, but very very slow 🙂
[13:27:41] Veronica Christenson: we are suppose to ask permission?
[13:27:59] Seren (serendipity.seraph): and only to a limited degree. the human mind only has a 7-9 item current conscious attention stack and its short term memory is quite atrocious
[13:28:10] Lissie Rumble: only the human brain can understand. a computer just follows orders
[13:28:16] Arch (archmage.atlantis): It does not, imo, do advanced math consciously, still it runs on the advanced math of the subconscious
[13:28:21] Seren (serendipity.seraph): its long term memory is as corrupt as hell so it confabulates to fill in all the details
[13:28:24] Lissie Rumble: yes Arch
[13:28:28] Hell Otsuka: Nope, human brain can’t understand either.
[13:28:51] Lissie Rumble: I disagree. I have two bows. and they both amazing
[13:28:58] Lissie Rumble: boys
[13:29:00] Lissie Rumble: sheesh
[13:29:01] Lissie Rumble: *Giggles*
[13:29:03] Bryce Galbraith: 🙂
[13:29:07] Bogdan Ixtab: everything in this universe just follows orders; the orders are the laws of physics
[13:29:10] Seren (serendipity.seraph): but it does all that it does at less than 35 watts of power consumed. that is amazing!
[13:29:24] Arch (archmage.atlantis): bows….geez….you expect them to bow to you?
[13:29:48] Bryce Galbraith: I have to confess this particular human-behind-the-keyboard is getting hungry…
[13:29:52] Bryce Galbraith: Been good talking with you all 🙂
[13:29:59] Bryce Galbraith: Thanks for hosting Khannea!
[13:30:08] Arch (archmage.atlantis): Namaste and Blessings Brucie
[13:30:11] Seren (serendipity.seraph): yes. this was a great talk.
[13:30:13] Khannea Suntzu: I realized yesterday that removing contactlenses with these extended fingernails is like really horrible.
[13:30:19] Ivy Sunkiller: if you want to fuck Khani – take a number and queue up!
[13:30:29] Peer Infinity takes a number 🙂
[13:30:30] Bryce Galbraith paid you L$200.
[13:30:49] Seren (serendipity.seraph): lmao
[13:30:50] Rhiannon of the Birds (rhiannon.dragoone): yes, Khannea, always a pleasure to be here
[13:30:53] Metafire Horsley thinks ‘Peer infinity’s number is infinity’? ^^^
[13:30:55] Arch (archmage.atlantis): Oh, I thought you were gay, Peer…..sorry, my bad
[13:30:57] Veronica Christenson: I know of some very, nice, places for sensual exploration, and there are always some very helpful people there
[13:31:00] Kimiko Yiyuan: Hm. Somehow stating that humans, who pretty much “invented” math, advanced or not, by starting to think about it, doing it, and constructing machines to help them do it, do not have a brain advanced enough for doing it…that sounds a bit strange.
[13:31:01] Ivy Sunkiller: Khani will be available for your perverse needs in the dark room behind shortly
[13:31:08] Bryce Galbraith: see you all next time 🙂
[13:31:11] Peer Infinity: gay? I’m pansexual 🙂
[13:31:14] Ivy Sunkiller: byes Bryce
[13:31:19] Seren (serendipity.seraph): peer is VERY polymorphous perverse
[13:31:29] Seren (serendipity.seraph): the term was created for peer 🙂
[13:31:40] Lissie Rumble: perverse
[13:31:46] Peer Infinity: hehe 🙂
[13:31:46] Ivy Sunkiller: haha
[13:31:58] Arch (archmage.atlantis): Teflon or aluminum core stainless steel
[13:32:09] Lissie Rumble: started with a big bang and shit, its just been a big bang ever since
[13:32:09] Khannea Suntzu: In fact
[13:32:15] Peer Infinity: lol! 🙂
[13:32:19] Khannea Suntzu: I invited peer over to visit me IRL 🙂
[13:32:21] Metafire Horsley: Kimiko: If humans were so good at math, they wouldn’t need to build machines to help them do it.
[13:32:58] Arch (archmage.atlantis): Hey, don’t dis my boy Turing
[13:33:12] Seren (serendipity.seraph): time to eat. er, food that is.. 🙂
[13:33:20] Ivy Sunkiller: haha
[13:33:21] Lolle Edenbaum: did we “invent math” or did we discover it?
[13:33:23] Ivy Sunkiller: byes Seren
[13:33:36] Veronica Christenson: nice point Lolle
[13:33:48] Ivy Sunkiller: we invented ways to harness it I’d say
[13:33:52] Peer Infinity adds “find a pussy-eating animation” to my to-do list…
[13:33:53] Lissie Rumble: Lolle, how can we have discovered it? its a man’s way of making sense of stuff
[13:33:53] Drake (drake8889.steerpike): You discovered a way to understand everything from numbers.
[13:34:06] Drake (drake8889.steerpike): Math is only a name
[13:34:09] Arisia Vita: I must run too, it’s been delightful being with you all, be well and happy till we meet again…
[13:34:12] Kimiko Yiyuan: I’m with what someone (you?) earlier said. Humans are pretty good at doing it, they are just not as quick or efficient. The fact that you need a pretty good brain to even construct those machines in the first place is telling a pretty different story about human capabilities though.
[13:34:14] Immm Back: thank you all for the meeting
[13:34:20] Veronica Christenson: I think she has a point, it seems math was here first
[13:34:22] Ivy Sunkiller: byes Immm 🙂
[13:34:24] Rhiannon of the Birds (rhiannon.dragoone): ur welcome Immm
[13:34:24] Immm Back: but i have to go and play Cuban music at an Italian club
[13:34:35] Rhiannon of the Birds (rhiannon.dragoone): oh, wow, cool, Immm
[13:34:36] Immm Back: you are all invited if you want to come when this meeting is finished
[13:34:45] Lissie Rumble: okies
[13:34:46] Khannea Suntzu: Yah I am about done
[13:34:50] Rhiannon of the Birds (rhiannon.dragoone): Immm, send the LM please
[13:34:53] Lolle Edenbaum: Thanks to everyone, it was very interesting listening to all those great thoughts and ideas
[13:34:56] Peer Infinity begins enthusiastically licking Khannea’s pussy… but is unfortunately unable to animate this action…
[13:34:56] Immm Back: i will
[13:35:00] Arch (archmage.atlantis): Have a negroni for me, love
[13:35:06] Kimiko Yiyuan: So I think they are perfectly up for doing it. Besides there is still a lot of research concerning the human brain going on. Who knows…
[13:35:07] Arisia Vita is Offline
[13:35:07] Rhiannon of the Birds (rhiannon.dragoone): thanks for coming, lolle
[13:35:08] Immm Back: see you a bit later
[13:35:09] Khannea Suntzu: I am looking to see if I can find any food in my accursed house
[13:35:16] Khannea Suntzu: Be back in a few minutes
[13:35:23] Khannea Suntzu: Hiii peer that tickles
[13:35:48] Kimiko Yiyuan: That humans might not be as smart as they often think of themselves though..well, THAT is another story for sure. 😉
[13:36:02] Lissie Rumble: oooo
[13:36:02] Metafire Horsley: Does anyone here know what will happen to Delicious?
[13:36:03] Peer Infinity: yes, it’s supposed to tickle 🙂
[13:36:09] Lissie Rumble: so where we all going?
[13:36:27] Khannea Suntzu: I am tied up here a while
[13:36:38] Laserkitty Ling (laserhop.rothschild): dont let her off
[13:37:13] Veronica Christenson: Time to go shopping 😀
[13:37:34] Peer Infinity: fortunately, you’re bound securely, so I don’t have to worry about the tickly sensations causing you to move away from the tickly pussy-licking 🙂
[13:37:44] Rhiannon of the Birds (rhiannon.dragoone): Time to get some clothes on and read Thomas Hardy
[13:37:51] Rhiannon of the Birds (rhiannon.dragoone): Erm, and not because he’s a Victorian
[13:38:25] Rhiannon of the Birds (rhiannon.dragoone): Victorian men liked naked women, just not in public and not when their wives were around
[13:38:56] Vick Forcella: I’m Vick… I like naked women
[13:38:59] Rhiannon of the Birds (rhiannon.dragoone): bye, arch, lolle, drake, kimiko, vick, meta, hell, Bog, ivy, and Laser
[13:39:04] Rhiannon of the Birds (rhiannon.dragoone): thanks Vick
[13:39:14] Rhiannon of the Birds (rhiannon.dragoone): Arch, you waved at me. That’s sweet
[13:39:32] Bogdan Ixtab: I have to say bye as well…need to go now
[13:40:05] Vick Forcella: I am just looking, in a victorian way
[13:40:12] Arch (archmage.atlantis): Rhi, I love you…..and Meta…..and K…..and Bryce,,,,,,and many others here
[13:40:15] Rhiannon of the Birds (rhiannon.dragoone) smiles at Vick
[13:40:23] Rhiannon of the Birds (rhiannon.dragoone): Yes, look but better not touch
[13:40:24] Arch (archmage.atlantis): love is love
[13:40:33] Arch (archmage.atlantis): Namaste and blessings
[13:41:14] Rhiannon of the Birds (rhiannon.dragoone): Vick, then come around and you’ll see more of me.
[13:41:28] Rhiannon of the Birds (rhiannon.dragoone): um, well, you can’t see more of me, but you know what I mean
[13:41:37] Khannea Suntzu: wow I am looking at my big SL ass and I am actually shocked, I have such a skinny teenager butt IRL I need to tone it down here as well.
[13:41:41] Vick Forcella: I get the idea Rhia
[13:41:55] Peer Infinity: hehe 🙂
[13:42:12] Peer Infinity playfully jiggles Khannea’s butt-fat 🙂