Technological Trends are Arrows Pointing Toward Inevitability
In the movie “I, Robot,” the actor Will Smith played a detective who distrusted robots. But all of the other characters in the film believed he was paranoid because, after all, what could possibly go wrong? And therein lies the irony of technology today: the human mind can conceive of the most imaginative ways to apply science while simultaneously ignoring the warnings of history and the pernicious potentialities of Pandora’s Box.
The fifth article I posted on my blog was dated three years ago today. It was titled “How to Transplant a Human Head” and remains, even now, one of my favorites; perhaps because it so directly addressed the Pandora’s Box of my own fear. In that article, I described it as follows:
… therein lies the dilemma of opening Pandora’s Box. …Because, the more things change, the more they stay the same. Mary Shelley’s Frankenstein gives rise to Philip K. Dick asking whether androids dream of electric sheep. The next thing you know, Harrison Ford is a Blade Runner tracking down a robot that crushed the head of its creator with its own hands, using its human-like, hydraulic digits to gouge out the creator’s eyes.
There seems to be a pattern here. First, the seed of mankind’s imagination impregnates reality and soon a premature Zombie Baby is born with black eyes and a cannibalistic desire for its own parents.
So what’s the point, you ask? It’s about technology being like fire; how it can both warm and burn. And it seems that for decades now someone has been turning up the heat. So much so, in fact, that I often find myself wondering whether our nightmares will soon come true through hellfire and convenience and the vain imaginings of mankind.
Since the 1980s there has been a concerted effort toward the development of reduced instruction set computing (RISC) architectures for computer processors. The best-known RISC family is the ARM (originally Advanced RISC Machines) architecture, which today powers some of the world’s fastest supercomputers as well as many of our modern smart devices, from phones and tablets to iPads and Android handsets. Although there have been many RISC designs, the “most public” of these were, according to Wikipedia:
…the results of university research programs run with funding from the DARPA VLSI Program.
Of course, the Defense Advanced Research Projects Agency (DARPA) is, according to Wikipedia:
…an agency of the United States Department of Defense responsible for the development of emerging technologies for use by the military.
And VLSI refers to DARPA’s Very Large Scale Integration project, which, according to Wikipedia:
…provided research funding to a wide variety of university-based teams in an effort to improve the state of the art in microprocessor design.
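For readers wondering what the “reduced” in RISC actually means: a RISC design favors a small set of simple, uniform instructions (load a value, store a value, add two registers) over the many complex, multi-step instructions of older designs. A toy sketch of the idea in Python follows; the instruction names here are invented for illustration and are not real ARM opcodes:

```python
# Toy register machine illustrating the RISC idea: a handful of simple,
# uniform instructions, each doing one small thing. The mnemonics
# (LOAD/STORE/ADD) are hypothetical, not actual ARM instructions.

def run(program, memory):
    """Execute a list of (op, *args) tuples against 4 registers and memory."""
    regs = [0, 0, 0, 0]
    for op, *args in program:
        if op == "LOAD":       # LOAD rd, addr  ->  rd = memory[addr]
            rd, addr = args
            regs[rd] = memory[addr]
        elif op == "STORE":    # STORE rs, addr ->  memory[addr] = rs
            rs, addr = args
            memory[addr] = regs[rs]
        elif op == "ADD":      # ADD rd, ra, rb ->  rd = ra + rb
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
    return memory

# Compute memory[2] = memory[0] + memory[1] using only load/add/store.
mem = run(
    [("LOAD", 0, 0), ("LOAD", 1, 1), ("ADD", 2, 0, 1), ("STORE", 2, 2)],
    [20, 22, 0],
)
print(mem)  # -> [20, 22, 42]
```

The point of the sketch is that even a useful operation like “add two numbers in memory” becomes several tiny steps; the payoff is that each step is simple enough to execute very fast, and with very little power, which is exactly why this style of processor ended up in our phones.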
Isn’t it something to see when a plan comes together?
Because in the early 1990s I was using a Motorola bag phone in my car, complete with a cigarette-lighter plug-in, a magnetic portable antenna on the roof, and 20 free minutes a month. Later, I had a Nokia hand-held phone with more coverage and better reception, as well as an illuminated LCD display and an alphanumeric scroll feature.
Then, sometime in the mid-nineties, I read an article that I wish I could read again, but it is now just a memory. The piece discussed a breakthrough of sorts in microprocessing that would allow our phones to become mini-computers, with various functions that, at the time, seemed incredible.