
Edited by Ted Bundy: 6/30/2013 11:24:13 AM

Superhuman artificial intelligence - if or when?

[quote]The technological singularity is the theoretical emergence of superintelligence through technological means.[/quote]

Do you guys think that as technology advances we will eventually hit a point where we create a machine with intelligence greater than that of any human being? I mean, when you look at it, computers, for example, are getting faster [i]faster[/i].

[quote]A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good. Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia. However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity.[/quote]

I do believe that at some point during the 21st century we will create artificial intelligence that is smarter than humans. That AI would undergo its own evolution, seeing large improvements both in its intelligence and in the rate at which those improvements occur, all in order to compete with other AIs. At that point the AI would have no obligation to promote the existence of its far less intelligent human creators and could choose to end the human race. The end of the human race could happen in this scenario even without malicious intent, since the AI could simply consume the resources humans need for survival. Essentially, it could be analogized as AI : humans :: humans : dogs in terms of intelligence.

What do you guys think? It's kind of confusing for me, but it's interesting at the same time. Ray Kurzweil, the man Bill Gates calls "the best person I know at predicting the future of artificial intelligence," predicts the singularity will occur around 2045. I don't think it's a matter of if, but when.
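
As a rough illustration of that "improvements to the rate of improvement" idea, here's a toy simulation (not anything from Kurzweil or Good, just made-up numbers): intelligence grows each cycle, and the growth rate itself keeps climbing because a smarter system is assumed to be better at improving itself.

    # Toy model of recursive self-improvement. All numbers are arbitrary.
    intelligence = 1.0   # start at "human level" (arbitrary units)
    growth_rate = 0.05   # 5% improvement per cycle to begin with

    for cycle in range(1, 31):
        intelligence *= 1 + growth_rate   # the system improves itself...
        growth_rate *= 1.10               # ...and gets better at improving
        print(f"cycle {cycle:2d}: intelligence = {intelligence:10.2f}, growth rate = {growth_rate:.3f}")

Ordinary exponential growth would keep the growth rate fixed; letting the rate compound too is what makes the curve "explode" after enough cycles.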

  • If humanity did create a super-intelligent A.I., then why would it waste its time on humans? It would probably go to the stars seeking new civilizations and killing them... or it could kill the human race first, then go to the stars and kill new civilizations.
