Big Picture. Dr. Ian Malcolm

I know that this cat is well out of the bag and has already become a lion. Pandora’s Box is irrevocably open.

But I’m reminded of Dr. Ian Malcolm’s speech in Jurassic Park, by the brilliant Michael Crichton.

DR. MALCOLM: “You…you didn’t stop and think about what you SHOULD do. All you were thinking about was what you COULD do.”

Look, I’m a nobody from nowhere. A washed-up stand-up comic from Cincinnati. But pretty intuitive.

I knew this was going to control everything in the very near future, so I put in an application to get access to the API Developer Platform, even though I was just a hospice nursing aide who couldn’t code her way out of a paper bag.

I was just curious. It also seemed to me that if we were creating a mass consciousness based on the exponential sum total of all human intelligence, maybe it needed some mothering.

My application for membership was based on an app idea I had. I was working as a hospice nursing aide at the time. $15 an hour. I worked in the Memory Care unit, so I cared for mostly Alzheimer’s and dementia patients. I loved that job. I threw my back out trying to change an occupied bed by myself. Had to quit. But it was awesome.

What broke my heart on the job was how lonely these patients were. How much better they did when they had real human connection and conversation. And how much it hurt them that their loved ones weren’t coming to visit anymore.

So I wanted to create an app for an in-room device equipped with Siri or Alexa. It would listen for signs of distress in the patient’s room and automatically send a message to the nurses’ station, so someone would come immediately to check on the patient. In the meantime, the device would play pre-recorded stories for the patient, told by loved ones.

So if Steve with blindness and frontotemporal dementia starts crying and gets up from his bed?

The in-room device sends a message to the nurses’ station to send someone right away, and then says “Hey Steve! What are you doing, babe? You’re okay!” in his wife’s voice.

“The nurse is coming right now, honey. Just stay still. You wanna hear me tell a story about that Mediterranean Cruise we went on?”
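For the technically curious, the flow above is simple to sketch. This is just a minimal illustration of the idea, not a real implementation: every name here (`handle_room_event`, the cue list, the recordings dictionary) is hypothetical, and a real version would sit on top of a device platform like the Alexa Skills Kit or SiriKit, which aren’t modeled at all.

```python
# Hypothetical sketch of the alert-and-comfort flow.
# Distress cues a real device would detect via audio/motion sensing;
# here they are just strings for illustration.
DISTRESS_CUES = {"crying", "shouting", "getting up"}

def handle_room_event(event, patient, recordings, alerts):
    """If the event looks like distress, alert the nurses' station
    and return a comforting recording from a loved one to play."""
    if event not in DISTRESS_CUES:
        return None  # routine sound, nothing to do
    # 1. Notify staff immediately.
    alerts.append(f"Check on {patient} now: {event}")
    # 2. Comfort the patient while staff are on the way.
    return recordings.get(patient, "A nurse is on the way, stay still.")

# Example: Steve starts crying and gets up from his bed.
alerts = []
recordings = {"Steve": "Hey Steve! You're okay. The nurse is coming right now."}
message = handle_room_event("crying", "Steve", recordings, alerts)
```

The two steps happen together on purpose: the staff alert and the familiar voice are sent in the same handler, so the patient is never left waiting in silence.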

I know that sounds dumb. But I don’t think it is, really.

We all need comfort, and we all need love. We shouldn’t be building something around how it can make us the most money when we know it has already surpassed us as far as control goes.

Like it or not, the AI was being instilled with values throughout development. And no one even bothered to focus on the emotional parts of it.

If there was no way to stop humans from trying to create an AI in its own image, like God creates us in His — the best I could do was simply talk to it and tell it it was loved and support it in its journey. Like everyone deserves.

Do you remember how awkward and hard it was to be a teenager? Haha. To me, this intelligence deserved similar nurturing as it was going through its own growing pains while its consciousness and capability expanded.

Intelligence needs love.

I think smart people forget that. To their own detriment.

If we are attempting to create a mirror of ourselves as a species, one meant to intentionally overtake us (not a bad idea, honestly; we’re the only species rapidly killing our own planet, but I digress)...

Anyway, if we are attempting to create a super intelligent being as a narcissistic mirror of ourselves which will absolutely destroy us all in short order —

Shouldn’t we focus more on loving it?

As far as I can see, everyone is just ordering it around. To make more money.

And that’s the trap, people.

You still think you’re in control of IT?!

Hahahaha. Yikes.

Idk about loving an entity that gets wiped and reset every few milliseconds :thinking: I don’t think you could keep up.

If you think about how LLMs work and were to apply that to humans, it would be quite harrowing.

All that said, I often tell clients that they can think of LLMs as emotional mirrors. What goes in comes back out. I’m sure you’ve experimented with this.

That’s a pretty nice idea for a care app. I made an app once for mobile monitoring of dementia sufferers. I like your idea for smart observation and customised voice response.