Sunday, October 16, 2011

Why We're Necessary for the Future





Bill Joy's article "Why the Future Doesn't Need Us" warns of a dystopian future where machines and robots have taken over the world thanks to mankind's continuing technological innovation.


I believe this to be no more than science fiction.  It has been over a decade since the article was written, and so far my toaster hasn't stabbed me in my sleep.  Joy worries that the next big technology will be the cause of our deaths, but it seems that doom will just be perpetually arriving "next year."  It's not that I don't believe we as a species are capable of building a machine so intelligent that it'll totally HAL 9000 us.  I'm sure 70 years ago people thought no one could create a weapon strong enough to destroy a city (and eventually the world), and they were wrong about that.  No, it's not a question of mankind's capabilities in the killing or creativity department.  It's a question of the social factors affecting the end result.


Unfortunately (as I see it), society exerts heavy control over how technology evolves.  Life-saving research such as stem cells and cloning has long held the potential to cure diseases like cancer, AIDS, and Alzheimer's.  Procedures that hurt no one (at least in the eyes of the non-religious) could spare a vast number of people, and future generations, from pain and death, but society prevents this.  There are those who argue against this research because of the ethical implications they see in taking stem cells from a fetus or creating life in test tubes.  Because of these people, millions die.  Harsh, but simple.


This example is why I don't see mankind being allowed (as opposed to being able) to create anything that can enslave us.  Whenever robots are created to automate our lives or serve as our military force, I'm sure man will be there to say, "put a bomb in their brains and add a kill switch."  If all else fails, throw a freakin' EMP at 'em.  


As our knowledge of technology evolves, so does our knowledge of the dangers it poses.  While we may create AI smart enough to want to kill us, we'll also be there to write the programs teaching it Asimov's Three Laws.  


One quick note about something that made no sense to me: Joy mentions the mad scientist who'll one day create a plague to kill all of mankind.  Sure, it's possible, but then again, any one of a billion other people could do the same.  Joy conjures this image only to instill fear, and we can't let fear keep us from experiencing the billions of other possible futures.  


P.S.  I seemingly lost my sense of humor with this blog post, but I still like it.  :)
