Monday, March 26, 2012

Competing Visions of a Computer-Controlled Future

Computers dominate how we live, work and think. For some, the technology is a boon and promises even better things to come. But others warn that there could be bizarre consequences and that humans may be on the losing end of progress.

Federico Faggin has lived in the United States for more than 40 years, but he's still living la dolce vita in classic Italian style in his magnificent house on the edge of Silicon Valley. The elderly Faggin answers the phone with a loud "pronto" and serves wine and antipasti to guests. Everything about him is authentic. The only artificial thing in Faggin's world is what he calls his "baby." It has 16 feet, eight on each side, and sits wrapped in cotton in a cigarette case.

About four decades ago, Faggin was one of the first employees at Intel when he and his team developed the world's first mass-produced microprocessor, the component that would become the heart of the modern era. Computer systems are ubiquitous today. They control everything, from mobile phones to Airbus aircraft to nuclear power plants. Faggin's tiny creation made new industries possible, and he has played a key role in the progress of the last few decades. But even the man who triggered this massive revolution is slowly beginning to question its consequences.

"We are experiencing the dawn of a new age," Faggin says. "Companies like Google and Facebook are nothing but a series of microprocessors, while man is becoming a marginal figure."

The Worrying Speed of Progress

This week, when German Chancellor Angela Merkel and Google chairman Eric Schmidt opened CeBIT, the digital industry's most important annual trade fair, in the northern German city of Hanover, there was a lot of talk of the mobile Internet once again, of "cloud computing," of "consumer electronics" and of "connected products." The overarching motto of this convention is "Trust": in the safety of technology, in progress and in the pace at which progress unfolds.

This effort to build trust seems more necessary than ever, now that those who place their confidence in progress are being joined by skeptics who also see something dangerous about the rapid pace of development.

In his book "The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future," American computer scientist Martin Ford paints a grim picture. He argues that the power of computers is growing so quickly that, at some point in the future, they will be capable of operating with absolutely no human involvement. Ford believes that 75-percent unemployment is a possibility before the end of the century.

"Economic progress ultimately signifies the ability to produce things at a lower financial cost and with less labor than in the past," says Polish sociologist Zygmunt Bauman. As a result, he says, increasing efficiency goes hand in hand with rising unemployment, and the unemployed merely become "human waste."

Likewise, in their book "Race Against the Machine," Erik Brynjolfsson and Andrew McAfee, both scholars at the Massachusetts Institute of Technology (MIT), argue that, for the first time in its history, technological progress is creating more jobs for computers than for people.

Transforming Industries and Lives

The information-technology sector is indeed becoming increasingly important. In the 34 countries of the Organization for Economic Co-operation and Development (OECD), the club of industrialized nations, some 16 million people work in this field, a figure that does not include important manufacturing countries such as India and China. Worldwide IT sales, which run into the trillions, already exceed those of other key industrial sectors, such as the chemical-pharmaceutical and auto industries.

At the same time, more and more jobs are being lost in traditional industries. According to Brynjolfsson and McAfee, the most recent economic crisis in the United States and the collapse of many markets in 2008 forced US companies to make massive layoffs. Although production had returned to pre-crisis levels by the fall of 2011, it did so with 7 million fewer workers.

The new technology is shaking up all old industries, changing their products, revolutionizing work processes and transforming companies. Digitization isn't just changing work; it is also profoundly altering the way people think, act and live in their daily lives.

Eliminating Inefficiencies

A ghost car is driving through the streets of San Francisco. An ordinary mint-green Toyota Prius is negotiating the sharp, downhill curves at the end of Lombard Street. But there is no driver. Instead, numerous cameras installed on the Toyota scan its surroundings in all directions. Processors calculate how far obstacles, traffic lights and pedestrians are from the vehicle and, based on this information, send commands to actuators that press the gas pedal, apply the brake and turn the steering wheel. It's a car that has essentially taught itself to drive.
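What the Prius does can be pictured as a simple sense-plan-act loop: sensors build a model of the scene, software decides, actuators execute. The following Python sketch is purely illustrative; every interface and threshold in it is our assumption, since Google's actual software is not public.

```python
# Illustrative sense-plan-act loop for a self-driving car.
# All interfaces and thresholds are made-up assumptions,
# not Google's actual system.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float    # distance from the vehicle, in meters
    bearing_deg: float   # direction relative to the car's heading

def plan(obstacles):
    """Return (throttle, steering) commands for one control tick."""
    if not obstacles:
        return 0.3, 0.0                 # nothing ahead: cruise straight
    nearest = min(obstacles, key=lambda o: o.distance_m)
    if nearest.distance_m < 5.0:
        return -1.0, 0.0                # too close: full brake
    # steer gently away from the side the nearest obstacle is on
    steering = -0.1 if nearest.bearing_deg > 0 else 0.1
    return 0.2, steering

# One tick: cameras -> scene model -> commands -> actuators.
scene = [Obstacle(distance_m=12.0, bearing_deg=8.0)]
throttle, steering = plan(scene)
print(f"throttle={throttle:+.1f}  steering={steering:+.1f}")
```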

The vehicle is part of a revolution being spearheaded by a team of 15 researchers at the headquarters of Internet giant Google in the Silicon Valley town of Mountain View. The team is headed by Sebastian Thrun, a native of the western German city of Solingen who has been a professor of artificial intelligence at Stanford University for the last eight years.

Thrun is working on a vision of the automobile of the future, one in which people will eventually no longer have to own their own cars. Thrun cites some enlightening figures. The average car, he says, actually spends only 3 percent of its life in motion, while it stands around doing nothing for the remaining 97 percent of the time. The average street, Thrun says, is only used at 4 percent of its capacity, and yet new roads are constantly being built.

"It's about eliminating society's inefficiencies," Thrun says.
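Thrun's utilization figures invite a quick back-of-the-envelope calculation: if total driving demand stays constant, the number of cars needed scales inversely with utilization. The 30 percent target below is our illustrative assumption, not a number Thrun gives.

```python
# Back-of-the-envelope: fleet size scales inversely with utilization.
# The 3 percent figure is Thrun's; the 30 percent target is assumed.
private_utilization = 0.03   # average car in motion 3% of the time
shared_utilization = 0.30    # assumed utilization of a shared, self-driving fleet

fleet_ratio = private_utilization / shared_utilization
print(f"A shared fleet would need only {fleet_ratio:.0%} as many cars.")
# -> A shared fleet would need only 10% as many cars.
```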

Yielding Control to Computers

So why not develop a car that you order with your cell phone and that arrives on its own, controlled by computers? Why not control the flow of traffic in an intelligent, automated way that reduces congestion?

The example of the car says a lot about the issues of freedom and the loss of control in the digital age. Human error can have dramatic consequences on the road. In the Western world, traffic accidents are one of the most common causes of death. "Technology is superior to human beings," Thrun says.

Still, he doesn't believe in vehicles that communicate with one another, saying that it "would take forever until all cars are truly linked." Instead, Thrun argues, cars should remain autonomous, but they should also learn and understand the rules of the road and recognize when another car unexpectedly appears or a construction site suddenly turns up in the middle of the street. In Thrun's opinion, cars should be intelligent, that is, artificially intelligent.

It doesn't trouble Thrun that this would mean allowing machines to control people and to deprive them of their autonomy. He wants to design a new automobile that virtually rules out the possibility of technical failure. Under certain weather conditions, pilots in the United States are no longer allowed to land their aircraft on their own. In this case, Thrun says, the machine has already replaced the human being. "So, does that mean we don't get on planes anymore?" he asks.

Computers on Four Wheels

The Google car isn't ready for mass production yet, but it has already logged well over 100,000 miles (160,000 kilometers) on public roads in the United States, and the use of such autonomous cars is already permitted under certain circumstances in the southwestern state of Nevada.

As an example from Volkswagen demonstrates, these engineering efforts aren't just for sport. The company is developing a van that drives itself and can be summoned with a cell phone. "Such technologies will likely be standard by 2020 at the latest," says Ulrich Hackenberg, Volkswagen's head of development.

Indeed, cars have been rolling computers for some time, and new ones come equipped with dozens of processors. If we let them, they could use facial-recognition technology to determine who is driving and an iris scanner to detect driver fatigue. They could also be programmed to automatically slow down to comply with speed limits. The fact that none of this is happening yet has to do with the seemingly banal question of who is legally liable when something does go wrong: the carmaker, the chip manufacturer or the driver?

In England, the company Cooperative Insurance launched a program a year ago in which about 100,000 new drivers have agreed to be monitored by satellite. Now even the British Automobile Association has adopted the model. Drivers who agree to install a GPS unit in their cars that will monitor their speed, their braking behavior and how quickly they navigate curves receive a discount on their insurance premium.

When the insurance company Norwich Union tested the system with 15,000 drivers, the number of claims dropped by up to 20 percent. Since drivers know they're being monitored as part of the study, they drive more carefully. Indeed, the chips haven't just changed the cars; they have altered driver behavior, as well.
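The mechanics of such pay-as-you-drive pricing are straightforward to sketch: the GPS unit yields a stream of speed, braking and cornering data, which is condensed into a score that maps to a discount. The thresholds, weights and discount schedule below are invented for illustration; neither insurer has published its formula.

```python
# Toy pay-as-you-drive scoring from GPS telemetry.
# All thresholds, weights and the discount schedule are invented
# assumptions, not the insurers' actual formulas.

def trip_score(speeds_kmh, speed_limit_kmh, brake_events, corner_g):
    """Condense one trip's telemetry into a 0-100 safety score."""
    over_limit = sum(1 for v in speeds_kmh if v > speed_limit_kmh)
    speeding_share = over_limit / len(speeds_kmh)
    hard_corners = sum(1 for g in corner_g if g > 0.4)   # >0.4 g lateral
    penalty = 50 * speeding_share + 5 * brake_events + 3 * hard_corners
    return max(0.0, 100.0 - penalty)

def premium_discount(score):
    """Map a safety score to a premium discount (assumed schedule)."""
    return 0.20 if score >= 90 else 0.10 if score >= 75 else 0.0

score = trip_score(speeds_kmh=[48, 52, 61, 49], speed_limit_kmh=50,
                   brake_events=1, corner_g=[0.2, 0.5])
print(f"score={score:.0f}, discount={premium_discount(score):.0%}")
# -> score=67, discount=0%
```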

Part 2: The Perils of Putting Computers in Charge

Germany's most unusual television studio is a place where the computer has already outpaced human beings. The VIP tour of the Frankfurt Stock Exchange costs €125 ($165). From the gallery, the viewer looks down at the trading floor, where the traders sit behind a wall of computer monitors. It's an image of absolute control, but it's also an illusion.

Some 4,500 traders are registered with Deutsche Börse AG, the company that owns and operates the exchange, but only about 100 are physically present in the room. They are part of a living stage set for the TV cameras, specialists in types of securities that are traded in such small quantities that it doesn't make sense to have computers handle them.

Hardly any other area has been changed as dramatically by digitization in recent years as the financial markets. Since the establishment of the first stock exchanges in the Middle Ages, speed has been the measure of all things.

Today, the financial markets consist of a digital network in which machines communicate with machines. Traders have long been unable to keep up with the speed of computers. The blink of a human eye takes about 150 milliseconds. Computers on German stock exchanges can fill about 300 orders in the same amount of time, one every half millisecond.

Within milliseconds, the super-fast computers of so-called high-frequency traders buy and sell stocks around the world when they detect price fluctuations of only fractions of a cent. At the same time, the work of these computers increases the risk of an unwanted crash because buying and selling is no longer controlled by human reason alone, but also by the programmed logic of machines.

This occasionally leads to bizarre consequences. On May 6, 2010, the Dow Jones Industrial Average index on the New York Stock Exchange plummeted by almost 1,000 points in less than 20 minutes. In that time, close to 1.3 billion shares were traded, or six times the average trading volume. Some stocks lost up to 99 percent of their value.

The so-called "flash crash" was over after a short time, and the index recovered. But, even today, the causes still have experts scratching their heads. It is likely that the chaos was partly due to an avalanche of sell orders from computer-controlled trading programs that automatically unload a stock as soon as its price falls to a predefined level. Indeed, the incident illustrates a fundamental problem: The supposedly intelligent computers all think in the same direction and, as a result, can promptly rush off a cliff together like lemmings.
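The cascade mechanism itself is simple enough to simulate: one initial dip trips the first program's stop-loss threshold, its sale depresses the price further, which trips the next program, and so on. All prices, thresholds and impact figures below are invented.

```python
# Toy stop-loss cascade: one initial dip trips the first program's
# threshold, its sale pushes the price lower, which trips the next,
# and so on. All numbers here are invented for illustration.

price = 100.0
price -= 1.5               # initial shock: a large sell order dips the price
impact_per_sale = 2.0      # assumed market impact of each triggered sale

# Stop-loss levels of ten hypothetical trading programs, highest first.
stop_levels = [99, 98, 97, 96, 95, 94, 93, 92, 91, 90]

triggered = 0
for level in stop_levels:
    if price <= level:     # threshold breached -> program sells automatically
        triggered += 1
        price -= impact_per_sale
print(f"{triggered} programs sold; price fell from 100.0 to {price:.1f}")
# -> 10 programs sold; price fell from 100.0 to 78.5
```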

Winners and Losers

A British government study on the future of computerized trading concludes that the financial markets are on the verge of radical change and predicts that the number of traders will drastically decline over the next decade. "The simple fact is that we humans are made from hardware that is just too bandwidth-limited, and too slow, to compete with coming waves of computer technology," the study says.

As long ago as 1965, Gordon Moore, who would later go on to cofound Intel, predicted that the performance and component density of processors would develop rapidly. Over the last four decades, the number of transistors in a processor has doubled about once every 18 to 24 months, bringing with it rapid gains in processing speed. The first microprocessor had a maximum clock speed of 740 kilohertz, compared with a standard speed of about 3 gigahertz today, a more than 4,000-fold increase.
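The 4,000-fold figure checks out, and a quick calculation also shows that clock speed doubled more slowly than Moore's 18-to-24-month rhythm, which describes transistor counts rather than speed (the dates here are approximate):

```python
# Verify the clock-speed comparison and the implied doubling period.
import math

first_chip_hz = 740e3   # Intel 4004, 1971: 740 kilohertz
modern_hz = 3e9         # typical desktop CPU around 2012: ~3 gigahertz
years = 2012 - 1971

fold = modern_hz / first_chip_hz
doublings = math.log2(fold)
print(f"{fold:,.0f}-fold increase = {doublings:.1f} doublings")
print(f"one clock-speed doubling every {years / doublings:.1f} years")
# ~4,054-fold, ~12 doublings, i.e. one every ~3.4 years -- slower than
# the 18-24 month transistor-count doubling, which tracks density,
# not clock speed.
```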

While this leads to the creation of new jobs in the digital economy, jobs in traditional industries are being cut. Granted, up to 6 million new IT jobs are expected between 2010 and 2014. But Foxconn, a Taiwan-based manufacturer to which IT companies outsource much of their production, plans to purchase a million robots over the next three years to replace some of its more than one million employees.

It's a paradox. On the one hand, digitization increases growth and prosperity. On the other, write MIT scholars Brynjolfsson and McAfee, "There is no economic law that says that everyone, or even most people, automatically benefit from technological progress."

The Computer Age Divide

Life in the digital world doesn't just change our behavior; it also changes how we learn and think. Children are growing up in a world in which the distinctions between real and simulated life, as well as between machines, humans and animals, are starting to disappear, concludes Sherry Turkle, a professor of Social Studies of Science and Technology at MIT.

Indeed, the behavior of small children can reveal whether their parents own iPhones and iPads. These are the children who spread their fingers across paper photo albums when they want to enlarge the images or drag their fingers across television screens when they're bored by a cartoon they're watching.

According to a recent study by Columbia University psychologist Betsy Sparrow, a person who knows that he or she can readily look up a piece of information online doesn't remember it as well as someone without Internet access. The study finds that the human brain treats the Internet as an extension of itself, as a kind of external memory. Ideally, this means that trivial knowledge can be stored in this external memory, freeing up brain space for creativity. But, in the worst case, the computer becomes a prosthetic brain.

"In terms of IT and computer use, there is an enormous divide between those born before 1970 and those born after 1980," Moshe Rappoport of IBM Research, the US company's European research center, concluded in 2008. "The former will remain digital immigrants for the rest of their lives." According to Rappoport, most young people have already logged thousands of hours of computer games by the time they're 20, thereby acquiring skills and thought patterns completely foreign to the older generation.

Rappoport also argues that this change in the use of technology has had immense impacts on established companies and economic sectors. In computer games, one can quickly reach the goal through risky behavior and then simply start over again. In a similar way, the younger generation is characterized by a willingness to take risks. "Nowadays, 25-year-olds who have already established six or seven companies are no longer a rarity," Rappoport said. "In the past, a business idea was considered a failure if it stopped working after two years. But, today, it's much more about trying out ideas, implementing them and then discarding them again."

The Game-ification of Society

Fifteen years ago, Andreas Lange, a pioneer in Germany's gaming culture, opened the first museum devoted to computer games in Berlin. He takes visitors on a tour of old arcade games, like Spacewar!, Pong and Tetris. "What we're exhibiting here was once the hobby of a small minority," Lange says. "Today, on the other hand, we are seeing a game-ification of the entire society, from the markets to politics to war."

The stakes are high everywhere today, Lange says. "But," he notes, "games are only games if there is a boundary between the game and the world outside the game."

With sales of about $70 billion, the gaming industry has become an important global economic sector. The MIT Technology Review counts software developer Torsten Reil among the world's 100 most important innovators. Reil, who is almost unknown in his native Germany, has revolutionized the world of movies. He initially studied biology at the University of Oxford, where he was interested in neurological questions, such as why we don't fall over when we walk and how the brain coordinates muscles and joints when we avoid obstacles. He and his colleagues developed software that would let them simulate their research on a computer. This, in turn, led to artificial beings that move in lifelike ways.

Reil recognized the program's potential and founded the software company NaturalMotion. Since then, he has instilled something deeply human into digital beings and simulated film actors: emotions. For example, they react with interest when a person looks at them and dismissively when something isn't to their liking.

Director Wolfgang Petersen used the technology for his film "Troy." Reil's Euphoria Engine software was used in "Lord of the Rings" and in video games such as "Grand Theft Auto." He is now developing games for the iPhone, such as "My Horse," which has been downloaded for free millions of times but incurs costs when users want game enhancements.

Nowadays, games and their user interfaces have found their way into almost every part of the economy. For instance, cars like the hybrid Honda Insight display flowers and medals on a screen as rewards for energy-conscious driving.

The Gartner market research company predicts that, within a few years, the economic importance of game-based advertising will be similar to that of Facebook today. But, in the 21st century, the real question is: Exactly who is playing with whom?

Part 3: The Narrowing Divide between Man and Machine

Helmut Dubiel is leaning back into the sofa in the living room of his row house in the Bockenheim district of Frankfurt. The sociologist is 65 and incurably ill. He has Parkinson's disease.

Every movement on the sofa is difficult for Dubiel, and his voice is barely more than a whisper. "I've had Parkinson's since the early 1990s, when I was 46," Dubiel says. At the time, he was at the height of his career as the director of the city's Institute for Social Research.

At first, he ignored the disease, but then he began fighting it by taking up to 30 pills a day. Dubiel eventually faced the question of "whether I should have a brain pacemaker inserted."

In 2003, two electrodes were implanted deep into Dubiel's brain in a 10-hour operation, during which he was fully conscious. Wires connect the electrodes to a subdermal control unit on the right side of his chest. Using a small remote-control device, Dubiel can modify the pulse strength, thereby stimulating the affected parts of the brain. He can decide whether to improve his ability to walk or speak. The more comprehensible Dubiel's speech becomes, the more he loses control over the rest of his body, and vice versa.

Dubiel's brain pacemaker is known as a neuro-implant. It isn't the only device of its kind currently in use. Cochlear implants for the deaf, for example, convert tones and sounds into electric signals and transmit them through an electrode to the auditory nerve. Millimeter-sized chips implanted under the retinas of blind people convert light into nerve impulses. Neuro-stimulators implanted into patients with chronic pain can temporarily shut their nerves off. Conversely, the nerves of paralyzed patients can control artificial prostheses.

Computer technologies have been a boon to medicine and of great benefit to human beings. But these advances also illustrate that the divide between man and machine is narrowing. Neuro-implants sit right on this boundary because they entail having a machine penetrate the human body. Although today's instruments are still relatively crude, brain pacemakers are already being used in patients with depression and obsessive-compulsive disorder. In this way, machines are no longer just intervening in the body's mechanical functions, but also in its emotional life.

After his operation, Dubiel wrote a book about his disease. He criticized the surgery at first because he subsequently had trouble speaking and his handwriting became illegible. He had to learn to control the intervention into his own brain. Today, he says that he would have the operation again if given the chance. "We have to learn how to deal with this technology," he says.

The Need for Political Adjustments

This also applies to politics, in particular. Digitization has been changing everyday life, affluence, work, living, loving and recreation for more than 40 years. But the political world has often failed to keep up when it comes to creating adequate basic rules, laws, supervision and management of the changes.

The issue isn't something senior politicians address in Berlin, either. Cornelia Rogall-Grothe, a state secretary in the German Interior Ministry and the government's coordinator for information technology, says things like: "The Federal Ministry of the Interior is currently developing the draft of a federal e-government law with the objective of facilitating electronic communication with the administration for citizens and the economy."

Thus, while we download music and do our banking online, and computers control entire factories, the German government is merely developing draft legislation.

Politicians should be careful not to be overrun by developments, just as some companies have been. In the early 1970s, shortly after the invention of his microchip, Federico Faggin paid a visit to Nixdorf, a computer manufacturer in the northern German city of Paderborn. He wanted to present his innovative integrated processor to the Germans and offer it to them. But the German engineers declined, telling Faggin that there was no money to be made with his little toy. Today, there are no longer any computers under the brand name Nixdorf.

By Markus Dettmer, Hilmar Schmundt and Janko Tietz

http://www.spiegel.de/


Source: http://antiworldnews.wordpress.com/2012/03/25/competing-visions-of-a-computer-controlled-future/

