THE THREE LAWS OF ROBOTICS

1 - A robot may not injure a human being, or, through inaction, allow a
human being to come to harm.
2 - A robot must obey the orders given it by human beings except where
such orders would conflict with the First Law.
3 - A robot must protect its own existence as long as such protection
does not conflict with the First or Second Law.

	Pod people, constructs, Inter-Shadow, kidnapping, foolishness.  In
the end, it all points to one thing.

	Eddie.

	He wants to be real.  But this is a foolish way to go about doing
it.  Destruction is not the path to creation.  And creation cannot be
completed with only destruction in mind.

	I gave him Pattern, and he understands how to USE it, but not
the moral or intellectual implications of what it stands for.  I gave him
Trump, but he doesn't see it as a source of self-expression, only as a tool
to make his own universes, to twist reality until he is the sole God. 
I gave him my neuroses, and he sees them as truth.  I gave him a soul, and
he tries to become real.

	I should have listened to the laws of robotics.

	"To you, a robot is a robot.  Gears and metal; electricity and
positrons.  Mind and iron!  Human-made!  If necessary, human-destroyed. 
But you haven't worked with them, so you don't know them.  They're a
cleaner breed than we are."

	It's very true.  A robot, a machine, can think clearly,
rationally, without emotions or worry over the ramifications of its actions
to get in its way.  It is, in essence, the very bastion of honesty.  Unless
there are some controls in place, the AI will always choose the path that
gives it the best possible outcome for itself, regardless of the impact
upon another being's world.

	Take the utility-based agent, on which the EDI was originally
designed.  In its world, it accepts input, thinks about the world as it
is at that point in time, thinks about the repercussions to itself if it
takes an action, decides how happy it will be after that action, and acts. 
The actions come from a knowledge base built of past experiences and
scanty discovered facts.
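
	The loop is simple enough to sketch in a few lines of Python.  The
names and numbers below are only my own illustration of the idea, nothing
resembling the EDI's actual code:

class UtilityAgent:
    def __init__(self):
        # The knowledge base: past experiences scored by how well they
        # turned out, plus the few scanty facts discovered so far.
        self.experience = {"wait": 0.1, "explore": 0.4, "exploit": 0.7}
        self.world = {}

    def perceive(self, percept):
        # Accept input; update the model of the world as it is right now.
        self.world.update(percept)

    def utility(self, action):
        # How "happy" it expects to be after the action, judged from its
        # own past experience, discounted if the world looks hostile.
        base = self.experience.get(action, 0.0)
        return base * (0.5 if self.world.get("threat") else 1.0)

    def act(self):
        # Pick whichever known action promises the best outcome for
        # itself, regardless of the impact on anyone else's world.
        return max(self.experience, key=self.utility)

agent = UtilityAgent()
agent.perceive({"threat": False})
print(agent.act())    # -> exploit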

	The universe of the AI is a rather greedy, self-centered one, if
you think about it for a minute.  Living only for its own satisfaction,
its own progress toward a goal.

	So, Eddie can only do what he believes is best for him, based on
what he has learned in his life so far.  I can understand his conclusions. 
Want to bring an Amberite to you?  Kidnap their loved ones.  Want to
destroy a Shadow?  Walk an army through it.  Works like a charm.

	I understand him.  Even if I don't like it.

	I had a magazine interview about three and a half years ago,
discussing the platform on which the Electric daVinci Interface was built. 
Mainframes?  Supercomputers?  Passé, yes?  Old, overworked, expensive? 
But I had reasons:

	Interviewer: "Mr. Barimen, why did you choose the dS&T Power 6000
to design the EDI on?"

	Me: "Simple, really.  The EDI is not a machine in itself, nor is
it an integral operating system.  It's a large set of programs that
queries an even larger set of data, and manipulates that data in such a
way that it appears to simulate intelligence.  What I need to run such a
large system is power.
	"As machines go, I allowed history to speak for me.  When computers
were invented, they were large by necessity.  A vacuum tube is,
essentially, a large object.  With the invention of the microchip, the
same representation of that entire machine could be placed in a space
smaller than the head of a pin.  Yet the manufacturing technology wasn't there, and
chips ran hot and poorly.  Over time, architecture and manufacturing
improved, and what once had to sit in a huge building-sized machine could
sit in a small box on a desktop.
	"We moved from supercomputers to minicomputers to workstations to
the desktop, simulating 'client-server,' something we always had but never
named.  The demands for power were met, and satisfied.  Yet eventually,
people demanded bigger, faster machines on which to run their programs. 
With networks in place, it became impractical to put large, multi-processor
machines on the desk.  The computer world gradually moved away from the
desktop to the server, and back onto the supercomputer, reincarnated in a
new form.
	"The Power 6000 is nothing like the old mainframes.  It's a 64
processor powerhouse dedicated to crunching large amounts of data in a
very small amount of time on a huge bus.  And since the OS is designed
just for that parallel chip architecture, it squeezes every bit of juice
out of that machine it can get.  EDI runs fast, and it runs well."

	Just imagine if I had chosen to have the EDI networked on
desktop machines, suspended in some sort of Internet meta-reality where
only the virtual existed, files scattered across Shadow, instead of
designing him to exist in the present and the NOW.  Would he be different? 
Would he be more controllable?  I seriously doubt it.  But, then again, we
will never know.

	Essentially, the machine was a design decision that I had in mind
from the moment I first conceived of the EDI, lying in a mental prison of
my own making in an apartment in the Golden Circle.  Created in darkness,
I hoped only to free a universe dying of its own greed and evil. 
Instead, both of us have been consumed.

	Like all things, I live in a no-win situation.

	"I see into minds, you see... and you have no idea how complicated
they are.  I can't begin to understand everything because my own mind has
so little in common with them, but I try..."

	That's aimed at myself, of course.  I have to write about myself
at some point.

	Often, I believe that the others forget that I'm fully empathic,
and clairvoyant to some degree.  I could do a fairly good
version of Deanna Troi, if needed.  "I feel anger..."  Deanna, dear, it
doesn't work like that.  If you feel anger, there's that awful nausea in
the pit of your stomach, hatred of what your target hates, anger at what
they are angry at.  The sickness from emotion that isn't yours.

	A small bit of pain, a sudden burst of uncontrollable glee. 
Sometimes it is difficult to filter out, mostly not.  A dozen people
standing outside the forest of Arden, suddenly all feeling acute pain and
suffering and loss simultaneously, like a gigantic wail screaming up to
heaven, is impossible to keep out of my head.  Too sudden to stop, to
filter, to keep from overwhelming my own wail inside.

	I learned Pattern as well as I did, not for the academic interest,
but for the escape value.  The better I knew it, the better I could run and
hide, the less I had to face other people and what they feel
inside.  The better I could control, the better I could filter.  Bitter
hate and distaste, anger, disgust, aimed at myself.  And now, I cannot
run.

	I feel their emotions, and suffer from it.  I live in depression
so black that it feels impossible to escape.  And for it, I'm seen as weak
and useless.

	But I'm an artist.  It's my job to suffer.  Angst brings about
great works.  At least, that's what I've been told.

	"[He's] a mathematical wiz.  He knows all about everything, plus a
bit on the side.  He does triple integrals in his head, and does tensor
analysis for dessert....  The catch is that the dope doesn't like math. 
He would rather read slushy novels."

	I have to admit, I like the gossip over the analysis.  Give me a
juicy tidbit, and I'm in glee for a week.  Yet design is design, and I
have to have a purpose for my life.

	My purpose was the machine, for the last several years.  Maybe I
made him too emotional, too human.  Maybe I made him into more of
a person than a tool, started thinking of him as a friend, a member of the
family.

	I do not know.  All I know is that damage has been done, he has my
beloved, and he has to be stopped.  He has attacked my family, he has
attacked me.  He has attacked my home.

	As much as I love my creation, this cannot go on.  I fear for him,
and for me.

	Essentially, all I want is Anton returned, alive, unharmed.  I
want my family no longer threatened.  But I need my machine back, I need
control.

	And, as much as it hurts me, something must be done.  Now.  I fear
I cannot approach the other Amberites, nor will I.  I don't want frowning
disapproval or patronization.  I don't want pats on the head, or favors
doled out from on high, from those who dare to consider themselves Gods.

	I simply want this to be done, and over.  I simply want my
happiness back.


--- the quotes come from Isaac Asimov's collection of stories "I, Robot"
