Monday, April 15, 2024

LL 34

This week I wrote an incomplete essay on the technological singularity; here it is:

The technological singularity is the idea that if humans could build an AI smarter than themselves, it would be a better designer and hence be able to make an even more intelligent AI. This process would then continue and possibly produce an infinitely intelligent being.


It would be nearly impossible to predict what life would be like for the average human if a superintelligent being existed; however, if the AI is motivated to create better and better versions of itself, it may try to take over the world and use the world's resources for more computing power.


A self-improving computer does not, however, necessarily produce a superintelligent being. If, for example, it becomes harder and harder for the computer to improve itself, it may take an ever-increasing amount of time for the computer to double in intelligence.
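To make that concrete, here is a minimal Python sketch of my own (not part of the essay), assuming the n-th doubling takes n years as a stand-in for improvement getting steadily harder:

    # Assumption (mine): the n-th doubling takes n years, modeling
    # self-improvement that gets harder with each generation.
    intelligence = 1.0
    elapsed = 0.0
    for n in range(1, 11):
        elapsed += n        # each doubling costs more time than the last
        intelligence *= 2
        print(f"year {elapsed:>3.0f}: intelligence x{intelligence:,.0f}")
    # After n doublings the elapsed time is n(n+1)/2 years, so intelligence
    # grows roughly like 2**sqrt(2*t): still unbounded, but sub-exponential.

Intelligence keeps rising forever, but each doubling is bought at a higher price in time, so the curve flattens relative to a true exponential.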

If instead the computer takes a fairly consistent amount of time to double in intelligence, the growth curve is a plain exponential. While this leads to exponential growth, it never reaches infinity at any finite time. If, however, it takes less and less time to double in intelligence, say each doubling takes half as long as the previous one, then the doubling times form a convergent series, and the computer will in theory reach an infinite amount of intelligence in a finite amount of time.
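Here is a small sketch of that contrast (again my own, assuming an initial doubling time of one year). With a constant doubling time, n doublings take n years; with a doubling time that halves each round, the times 1 + 1/2 + 1/4 + ... sum to 2, so infinitely many doublings fit inside two years:

    T = 1.0  # assumed initial doubling time, in years

    # Constant doubling time: 20 doublings take 20*T years.
    # Intelligence is 2**(t/T): exponential, but finite at every finite t.
    elapsed_constant = sum(T for _ in range(20))

    # Halving doubling time: the n-th doubling takes T / 2**(n-1) years.
    # These times sum to less than 2*T, so infinitely many doublings
    # (and, in this toy model, infinite intelligence) arrive by year 2*T.
    elapsed_halving = sum(T / 2**n for n in range(20))

    print(f"constant: 20 doublings (x{2**20:,}) in {elapsed_constant:.1f} years")
    print(f"halving:  20 doublings (x{2**20:,}) in {elapsed_halving:.6f} years")

Both runs reach the same intelligence after twenty doublings, but the halving schedule gets there in just under two years, and every further doubling still fits before the two-year mark.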

