Thursday, September 23, 2010

The "Devicecification" of America....

How many devices do Americans have? How many do they need? What are these devices doing to us? (Note: This article is dedicated to my mother, Gloria, who was and continues to be a great inspiration in my life.)

It all started for me with a headline I read in Mashable today. It got me thinking. 82% of Americans carry cell phones. Seems reasonable enough. They can be life-saving devices (and at times death-causing devices).

But I began to wonder how many other devices Americans carry with them. Most have laptops. They carry a cell phone. They carry keys: keys for the house, car, boat, office, etc. They carry a wallet. Inside this wallet can be a universe of items. They might carry different kinds of credit cards that are to be used only in certain situations. They carry identification of different types. Inside these credit cards, there is also a universe in miniature. There is no doubt that in the near future, credit cards will become small computers using Card 2.0 technology. These "smart" credit cards will have more processing power in them than the computers that were aboard the Apollo missions.

So let us first try to define what a device is. Most think of it as synonymous with gadget: some manufactured machine that usually serves a single purpose. But is there another definition of a device? Consider this quote from a medical software engineering company:
Many people would not consider software -- a set of coded instructions -- to be a "medical device." However, as software takes on more critical diagnostic and therapeutic roles, the consequences of such "device" failure can be catastrophic. That explains why the European Union's latest directives follow the FDA's lead in treating certain stand-alone software or embedded software as medical devices...
In fact, with the virtualization of hardware that is going on in the software industry, there may actually be a reduction in the number of hardware devices. A typical example of this is the calculator we use on our computers. It looks like a physical calculator, but it is all software. It is still a device. Soon even that traditional calculator-like interface may disappear, as the people who remember what a physical calculator looked like are replaced by those who never used one. Google already replaces that calculator interface with a new one. Anyone with an iPhone or any other smartphone carries hundreds of apps, or devices, in it. If you look at even the simplest cell phone, you will see how many functions, or devices, it has within it. So if we look at devices in this way, then it is clear that the average American carries hundreds of devices, if not thousands. This was not the case with their great-grandfather 80 years ago.
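To make the idea of a purely software "device" concrete, here is a minimal sketch (in Python, purely illustrative and not tied to any particular product) of a calculator that exists only as code:

```python
# A purely software "device": a calculator with no dedicated hardware at all.
import operator

OPS = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "/": operator.truediv,
}

def calculate(left, op, right):
    """Apply one basic arithmetic operation, the way a pocket calculator would."""
    if op not in OPS:
        raise ValueError(f"Unsupported operation: {op}")
    return OPS[op](left, right)

# Usage: the "device" is just a function call.
print(calculate(6, "*", 7))  # 42
```

The "device" here is nothing but a handful of instructions, yet it stands in for a piece of hardware people once had to carry.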


In our pockets, we carry a macrocosm, which in turn is a microcosm of a bigger macrocosm. We are slowly becoming one with our devices. Without them we will not be able to function in this ever-evolving, complicated society. And this is only the beginning of a trend. With ever-increasing power, quietly humming beneath our senses, these devices will be doing whatever it is they are supposed to do. All these devices will talk to each other in an ever-entangling web of computer chips. All physical devices will talk to each other through the internet, wirelessly. This will be the internet of things (web 3.0), which will unify physical things with online things until they are indistinguishable. Each device will be what it is and know only the few simple things it has to do. But this architecture of the miniature will achieve things that even the greatest emperors of the Earth could not have done, like landing a plane safely.
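To give a rough flavor of what "devices talking to each other" could look like, here is a tiny, hypothetical sketch (in Python; the class and topic names are made up for illustration) of two simulated devices exchanging messages over a shared bus, the publish/subscribe pattern that many internet-of-things systems are built on:

```python
# A toy, in-memory "message bus" illustrating the publish/subscribe pattern
# behind many internet-of-things designs. All names here are hypothetical.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a device's handler for messages on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every device listening on the topic."""
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()

# A "thermostat" device listens for temperature readings...
bus.subscribe("home/temperature", lambda msg: print(f"Thermostat saw: {msg}"))

# ...and a "sensor" device publishes them, never knowing who is listening.
bus.publish("home/temperature", {"celsius": 21.5})
```

The point of the pattern is that neither device needs to know the other exists; each only knows the one simple thing it has to do.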

The Blurring of Human and Machine
This devicecification, as I call it, is changing US. Some look at the brain as something that changes slowly over generations in some slow evolutionary process. Although this may be true, it misses a very important point. The brain is constantly changing and growing, or degenerating as the case may be, in response to external stimuli. Your brain is sensing everything around you and is always in a state of adjustment. Susan A. Greenfield, a neuroscientist at Oxford University, says the following:
...the brain is not the unchanging organ that we might imagine. It not only goes on developing, changing and, in some tragic cases, eventually deteriorating with age, it is also substantially shaped by what we do to it and by the experience of daily life. When I say "shaped", I'm not talking figuratively or metaphorically; I'm talking literally. At a microcellular level, the infinitely complex network of nerve cells that make up the constituent parts of the brain actually change in response to certain experiences and stimuli.
Perhaps our brain could also be considered a device, made of an almost infinite number of smaller "devices." If so, it is by far the most complex. So it is correct to say that your brain, my brain, and your grandfather's brain are very different at a microcellular level due to the different stimuli we've experienced. Dr. Greenfield goes on to say more about technology's effects on the brain:
Already, it's pretty clear that the screen-based, two dimensional world that so many teenagers - and a growing number of adults - choose to inhabit is producing changes in behaviour. Attention spans are shorter, personal communication skills are reduced and there's a marked reduction in the ability to think abstractly.
Some have questioned the scientific validity of her statements.

Yet changes to the brain keep being recorded. A recent article in the New York Times states:
The technology makes the tiniest windows of time entertaining, and potentially productive. But scientists point to an unanticipated side effect: when people keep their brains busy with digital input, they are forfeiting downtime that could allow them to better learn and remember information, or come up with new ideas.
A lab experiment with rats revealed the following:
...when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory of the experience.
Of course, this only deals with the supposed abuse of technology, not the use of it. Another recent New York Times article went further, however. Commenting on research conducted by Eyal Ophir at Stanford University, it states:
The multitaskers then did a significantly worse job than the non-multitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information. So, too, the multitaskers took longer than non-multitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems.
Further on, it comments on the fears some researchers have about the effects that video game technology may have on younger, developing minds, saying, "...that constant digital stimulation like this creates attention problems for children with brains that are still developing, who already struggle to set priorities and resist impulses."


There are, however, some positive findings related to those who "multitask":
Preliminary research shows some people can more easily juggle multiple information streams. These “supertaskers” represent less than 3 percent of the population, according to scientists at the University of Utah.

Other research shows computer use has neurological advantages. In imaging studies, Dr. Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry.

At the University of Rochester, researchers found that players of some fast-paced video games can track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter.

 *.*
What is lacking here, I think, is a historical perspective. All new technologies have caused paradigm shifts. The printing press did:
Possibly no social revolution in European history is as fundamental as that which saw book learning (previously assigned to old men and monks) gradually become the focus of daily life during childhood, adolescence and early manhood.... As a consumer of printed materials geared to a sequence of learning stages, the growing child was subjected to a different developmental process than was the medieval apprentice, ploughboy, novice or page.
 *
Was the arrival of the printing press seen as pure progress? It all depends on your perspective. To the older generations, books no doubt seemed unwelcome; they contributed to an erosion of respect for one's elders. The printing press increased skepticism.

Neuroplasticity
There’s been lots of hand-wringing about all the skills they might lack, mainly the ability to concentrate on a complex task from beginning to end, but surely they can already do things their elders can’t—like conduct 34 conversations simultaneously across six different media, or pay attention to switching between attentional targets in a way that’s been considered impossible. More than any other organ, the brain is designed to change based on experience, a feature called neuroplasticity. London taxi drivers, for instance, have enlarged hippocampi (the brain region for memory and spatial processing)—a neural reward for paying attention to the tangle of the city’s streets. As we become more skilled at the 21st-century task Meyer calls “flitting,” the wiring of the brain will inevitably change to deal more efficiently with more information. The neuroscientist Gary Small speculates that the human brain might be changing faster today than it has since the prehistoric discovery of tools. Research suggests we’re already picking up new skills: better peripheral vision, the ability to sift information rapidly.
  **
The human brain has been exposed to many challenging stimuli in the past: horrific wars, natural disasters, personally traumatic experiences, and so on. Perhaps the old saying "whatever does not kill you makes you stronger" applies equally to the physiology of the brain. I think it is clear that any device (let us call the brain that for now) that can adjust in real time and compensate for unexpected events has the best chance of surviving the relentless onslaught of innovation and change that technology brings us.

Yet the most compelling reason why multitasking, however you define it, is here to stay is that the information age demands it. David Freedman explained it succinctly in a 2007 article titled "What's Next: Taskus Interruptus":
So what gives? Does multitasking really impair our ability to get our jobs done? The answer for most workers is, I think, no. But it's not because multitasking doesn't impair your ability to perform tasks. It does. It's because we're now in a complex, fast-response world in which getting a complete task done in the least amount of time is no longer the priority. Instead, today's top priority is to immediately address whatever fraction of a vast, malleable range of tasks has become most critical--a just-in-time, networked workstyle. Focusing on one task to the exclusion of others isn't even an option anymore. When experts examine the detrimental effects of multitasking on productivity, they're asking the wrong question. We don't need to wonder about the ways in which multitasking and interruption impair our ability to speed through a task. We need to appreciate the ways in which multitasking and interruption have become essential to meeting the increasingly nonlinear demands of our jobs.
He goes on to say:
...businesses have long been moving away from the sort of stovepipe structure that allowed employees to focus on meeting the demands of a single boss or worry only about a small group of employees or customers. Today the dotted-line relationships form a dense web that extends out to customers, suppliers, and partners. In other words, forget about closing the door and crunching on that one presentation. You've got 20 other people breathing at you just as hard, and each one wants to know that you're making progress. "The way we look at getting the job done is changing," says Martin Frid-Nielsen, CEO of Soonr, a Campbell, California, company offering a service that connects cell phones to PC applications. "It's about how in touch you are and how you're engaging many other people."
So what is all this devicecification doing to us? I suspect many things. It is turning us into part machine. Multitasking is a lot like the way a computer processor does things. A processor cannot really multitask, if by multitasking we mean doing two things at the precise same nanosecond. It simply switches tasks very quickly, giving the appearance of multitasking. Nor would any supporter of human multitasking assert that people are doing two things at the PRECISE same moment; they are quickly switching between tasks. (A minimal sketch of this rapid task-switching is given after the quote below.) Again, in the analogy between the rise of computers and the invention of the printing press, there is another parallel:
In the information age, then, we are already seeing some examples of the dominance of unintended consequences over intended ones. Particularly in light of similarities with the printing press era, it would be surprising indeed if there weren't more dominating unintended consequences to come. This is not to suggest that we shouldn't take action for fear of unintended consequences. Far from it. It is to suggest that we are beyond our cause-and-effect reasoning abilities when it comes to networked computers and that we should be prepared for--and actively seeking--unintended consequences of actions we do take.
 ***
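As mentioned above, here is a minimal sketch (in Python, illustrative only) of that task-switching idea: a single "processor" (here just a scheduling loop) never does two things at once; it switches between tasks so quickly that the work appears simultaneous.

```python
# Illustrative only: a single "processor" (this loop) interleaves two tasks by
# switching between them one small step at a time; it never runs both at once.
from collections import deque

def task(name, steps):
    """A task that yields control back to the scheduler after each step."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # pause here; the scheduler will resume us later

def round_robin(tasks):
    """Run tasks by giving each one a turn, over and over, until all finish."""
    queue = deque(tasks)
    while queue:
        current = queue.popleft()
        try:
            next(current)          # run one step of this task
            queue.append(current)  # then send it to the back of the line
        except StopIteration:
            pass                   # this task is finished

round_robin([task("email", 3), task("report", 3)])
```

Run it and the output from the two tasks interleaves, step by step; at no instant are both tasks actually executing.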
This devicecification is creating a new matrix-grid in which we move, breathe, and have our being. We travel within this matrix-grid and depend on it. And like a wild animal that has been domesticated for generations, if we lose our domesticated lifestyle, we may not be able to survive in the wild. Either way, these concerns do not matter. "Progress" will move on. This technological tsunami will not be stopped by any critics. It will still hit us with its full force. Like any great force of nature, we must learn to live with it rather than attempt to stop it. If indeed we are living in the matrix, then the best way to improve it is to live inside it rather than try to escape it. You cannot change something if you are outside the process. And this devicecification, if it is indeed as great as purported to be, cannot be left to chance and to "others"; we ALL must become part of the process.




3 comments:

Anonymous said...

There is a pretty good book that gets into much of this called "iBrain" by Gary Small. I find it funny that the reason I began reading "iBrain" is that I am extremely curious about how much less people read nowadays. And they seem to read less and less as the amount of technology increases. I've been thinking for a few years now that soon reading will become more of a specialty rather than a commonality. Unfortunately, the book didn't cover that as much as I hoped. However, it does mention it briefly. And now this blog mentions it somewhat when talking about the printing press. I wonder if other people have the same feeling I do. After all, people used to read mostly to either learn something or entertain themselves. Now if people want to learn they turn to Youtube, and if they want entertainment they have movies and television. So I guess the only question left is not IF this will happen, but how long it WILL take.

Anonymous said...

I just signed up to your blog's RSS feed. Will you post more on this subject?

Unknown said...

I assume you are the same person. It is the same answer. Is there something in the article you wanted to learn more about?
