Re: virus: kurzweil cuts the mustard

MemeLab@aol.com
Wed, 30 Dec 1998 18:39:37 EST


In a message dated 12/30/98 4:49:30 PM Central Standard Time,
Sodom@ma.ultranet.com writes:

<< > The point is that we don't understand consciousness in people,
> so we can't build it into machines.

We don't understand a lot about electrons either, but we can make 'em do a
lot of tricks. >>

Consciousness, whatever it is, is something that evolves in a particular
information-processing system, namely our brains. I don't think it is
something that we can "build" into a machine. I think that if you build an
information-processing machine with the appropriate architecture and the
appropriate operating parameters, and allow it to interact in an appropriate
cultural and physical environment, it will evolve a consciousness, whatever
that is, without us having to "build" it or program it like a typical
linear-processing computer (programming is what I assume "build" means here).

I don't think it is necessary to completely understand consciousness in order
to set up the circumstances for it to most assuredly evolve/emerge. Although I
imagine that understanding the crucial details of this architecture, these
parameters, and this environment would go a long way toward understanding
consciousness, knowing what is crucial is not necessary to happen upon the
right circumstances. A lesser (but still significant) understanding can get
us there as well.

<< > If you believe that any
> sufficiently powerful computer would be conscious, then that's
> just blind faith.
> --
> Robin >>

I would imagine that it takes a little more than just sheer power. But I
think that if the right architecture, parameters, and environment are in
place - whatever those are and however that is accomplished - consciousness
will emerge. That isn't blind faith. I am one such computer, and so, I
assume, are you and the
other participants in this discussion. Nobody had to actually understand
consciousness for this to happen.

-Jake P.