
THE NEXT DIGITAL DECADE_


CHIN-CHEN CHAUNG


CONTENTS

INTRODUCTION
0 ABOUT
0.1 WHAT IS A COMPUTER?
0.2 BRIEF HISTORY
0.3 HERE IS THE BEGINNING
1 APPEARANCE
1.1 SIZE
1.2 DESIGN
1.3 SHAPE
1.4 MATERIAL
2 FUNCTIONALITY
2.1 APPLICATION SOFTWARE
2.2 COMMUNICATION
2.3 NATURAL USER INTERFACE
3 HUMANITY
3.1 TRANSFORM
3.2 NEW LANGUAGE, NEW VALUES
3.3 DIGITAL NARCISSISM
3.4 BRAIN CHANGES


INTRODUCTION_

If you ask me why I chose this topic, I will say that it was not my decision. In the beginning, I had no idea what the future of the computer would be. Because of this project, I started to research the development of technology, and I was surprised to find that it is not as boring as I had imagined.

This book is about the future of the computer. I will trace the development of the computer from the past to the present, and then focus on the human side of the computer’s future. The audience is people who are interested in computers and who have the ability to shape that future.


0_ [ABOUT]


0.1 WHAT IS A COMPUTER?


0.2 BRIEF HISTORY

1703: Binary code
1822: Charles Babbage
1890: Herman Hollerith
1941: Konrad Zuse Z3
1942: Atanasoff-Berry Computer
1944: Alan Turing, Colossus
1945: John von Neumann
1947: William Shockley, first transistor
1951: UNIVAC
1958: Jack Kilby & Robert Noyce, integrated circuit
1964: Douglas Engelbart
1968: Ted Hoff
1973: Xerox Alto
1975: Altair 8800
1976: Steve Jobs & Steve Wozniak found Apple Computer
1977: Apple II
1981: DOS operating system; first IBM PC; first laptop
1984: First Macintosh computer
1985: Windows operating system
1988: Microsoft vs. Apple


Webster’s Dictionary defines “computer” as any programmable electronic device that can store, retrieve, and process data. The basic idea of computing develops in the 1200s, when a Muslim cleric proposes solving problems with a series of written procedures.
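To make that definition concrete, here is a minimal sketch in Python of a program that stores, retrieves, and processes data. The readings and the average computed from them are invented purely for illustration.

# A toy illustration of the definition: a program that stores,
# retrieves, and processes data. The readings are invented.

storage = {}                                 # store: keep data for later
storage["temperatures"] = [18, 21, 19, 23]

readings = storage["temperatures"]           # retrieve: get the data back
average = sum(readings) / len(readings)      # process: compute something new

print(f"average temperature: {average:.1f}")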

As early as the 1640s, mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this “machine” came 140 years before the development of the modern computer.

Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his difference engine is sufficiently developed by 1842 that Ada Lovelace uses it to mechanically translate a short written work. She is generally regarded as the first programmer. Twelve years later George Boole, while professor of mathematics at Cork University, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today’s IBM.

Just prior to the introduction of Hollerith’s machine, the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although hand-powered at first, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage, the machine can handle simple calculus problems, but accuracy is a problem.


The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition.

In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable electronic device, which he completes in 1938.

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry, assists. The “ABC” is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculations. He shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students.

The Enigma, a complex mechanical encoder, is used by the Germans, and they believe it to be unbreakable. Several people, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a “Universal Machine” capable of “computing” any algorithm in 1937. That same year George Stibitz creates his Model K(itchen), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs, and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained by this shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.


“IMAGINATION IS MORE IMPORTANT THAN KNOWLEDGE. FOR KNOWLEDGE IS LIMITED, WHEREAS IMAGINATION EMBRACES THE ENTIRE WORLD, STIMULATING PROGRESS, GIVING BIRTH TO EVOLUTION.” —ALBERT EINSTEIN

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the University of Pennsylvania. Designed by John Mauchly and J. Presper Eckert of the Moore School, they get help from John von Neumann and others. In 1944, the Harvard Mark I is introduced. Based on a series of proposals from Howard Aiken in the late 1930s, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper (“Amazing Grace”) as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine’s development.

Early in 1945, with the Mark I stopped for repairs, Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as “debugging”. The same year von Neumann proposes the concept of a “stored program” in a paper that is never officially published.

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind on technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping-pong balls) over the lights. The US patent office will later recognize this as the first computer.

The next year scientists employed by Bell Labs complete work on the transistor (John Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in 1956), and by 1948 teams around the world work on a “stored program” machine. The first, nicknamed “Baby”, is a prototype of a much larger machine under construction in Britain and is shown in June 1948.

The impetus over the next five years for advances in computers is mostly the government and military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes “reusable software,” code segments that could be extracted and assembled according to instructions in a “higher level language.” The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 presidential election. They do not air the prediction for three hours because they do not trust the machine.

IBM introduces the 701 the following year. It is the first commercially successful computer. In 1956 FORTRAN is introduced (proposed in 1954, it takes nearly three years to develop the compiler). Two additional languages, LISP and COBOL, are added in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today’s languages.

With the introduction of Control Data’s CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks is common.


MORE RECENT ADVANCES

In 1961 Fairchild Semiconductor introduces the integrated circuit. Within ten years all computers use these instead of the transistor. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today’s systems so powerful, including virtual memory, pipeline instruction execution and paging. Designed at the University of Manchester, some of the people who developed Colossus twenty years earlier make contributions.

On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented: IBM guarantees the “upward compatibility” of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the “computer age” with the introduction of TSS (Time Share System), a crude (by today’s standards) networking system. It is the first wide area network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network.

Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC’s PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.

In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today’s Internet, ARPANet, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the “personal computer.” Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begin discussing forming their own company. This company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word computer in the title. In 1971, Texas Instruments introduces the first “pocket calculator.” It weighs 2.5 pounds.

With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.


In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC compiler for the machine. The next year Apple begins to market PCs, also in kit form. It includes a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes online with the first royal email message.

During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PCs. Continuing today, companies strive to reduce the size and price of PCs while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (it’s actually IBM’s second attempt, but the first failed miserably). Time selects the computer as its “Machine of the Year” in 1982. Tron, a computer-generated special effects extravaganza, is released the same year.


0.3 HERE IS THE BEGINNING


“THAT’S WHAT’S COOL ABOUT WORKING WITH COMPUTERS. THEY DON’T ARGUE, THEY REMEMBER EVERYTHING, AND THEY DON’T DRINK ALL YOUR BEER.” —PAUL LEARY


It is Monday morning, and someone is jogging along the road. His name is Gary Winston, and he lives in the year 2020. Suddenly his watch-phone rings; he answers it with his voice, and the music playing from the watch stops automatically.

“Boss, I need you to be at home right now! The video meeting starts in 20 minutes.”

“O.K., I got it. See you then. Goodbye!”

When he arrives home, the lights and all of his electronics turn on automatically. Then he speaks to the television and asks it to connect him to the video meeting with his employees. Next, he puts his keychain on the screen to download the meeting documents. After the meeting, he decides to go to the office. He doesn’t need to touch anything; everything is controlled by his voice. The only thing he needs to do is communicate with his car...


1_ [APPEARANCE]

1.1 LESS IS MORE

The electronics industry underwent a fundamental change with the invention of the transistor, paving the way for the microelectronics revolution. Devices are now constructed in complex arrays using lithographic methods derived from classical photography. However, as the size of the circuit shrinks, the resolution of the features that can be fabricated by this so-called “top-down” methodology reaches a limit, currently about 100 nanometres.

If computers are to become even smaller - and hence faster - the size of the circuit must be reduced towards the “nanoscale” domain. One prospect is the so-called “bottom-up” approach in which smaller electronic components are constructed from even smaller building blocks, preferably by a process of chemical self-assembly.

Each spacer molecule included a central chemical group related to the herbicide Paraquat. The group can easily accept one or two electrons, thereby changing its electronic state. This chemistry allowed a group of spacer molecules to behave as controllable molecular wires, connecting the gold nanoparticle to the base contact.

The number of electrons per nanoparticle required to achieve each of these effects is probably less than 30. Our report shows that it is possible to control electrical conduction through these structures by the electronic state of the spacer molecules.

These research efforts represent a step towards decreasing the size of electronic components using chemical means. The challenges for further development are twofold. The design and creation of chemical structures that are able to organise themselves will permit the construction of more elaborate nanoscopic devices by self-assembly. There is also a need for the development of methods by which these elements may be connected together and interfaced to the macroscopic world using the engineering nano-fabrication techniques that are becoming available.

The future will probably involve the conjunction of the two approaches to nanotechnology: bottom-up, which has its roots in the synthetic and physical chemistry of self-assembled nanostructures, and top-down, representing the engineering approach to the construction of nanoscopic objects.


LAPTOP DESIGN IS SET TO CHANGE SIGNIFICANTLY. THESE CONCEPT NOTEBOOKS POINT THE WAY.

1.2 CONCEPTUAL DESIGN

The Compenion concept notebook, designed by Felix Schmidberger, eschews the familiar clamshell design in favor of two superbright organic LED panels that slide into place next to each other, making the notebook just three-quarters of an inch thick.

The Canova concept notebook from V12 Design features two touch-sensitive displays. It can be oriented as a traditional laptop for typing or writing, laid flat as a sketch pad or turned on its side as an e-reader.

The Siafu concept notebook, designed for the blind by Jonathan Lucas, omits a display altogether. Images from applications and Web sites are converted into corresponding 3-D shapes on Siafu’s surface. It can be used for reading a Braille newspaper, feeling the shape of someone’s face or going over a tactile representation of a blueprint, for example.

The Cario concept notebook from Anna Lopez can be carried around by its handle, positioned like an easel or placed on a car’s steering wheel. When the car’s not in motion, the Cario can project maps, video conferences and more onto the windshield.

The Solar Laptop Concept, designed by Nikola Knezevic, has an extra hinged lid covered with solar cells that can be adjusted to get the most out of the sun.

1.3 SHAPE ADVANCE

The future of the laptop and the tablet PC has been rolled up into one. German designer Evgeny Orkin has developed a concept design for the Rolltop. The Rolltop is constructed of a flexible OLED display that wraps around the removable power-supply stand. Tucked into the power stand are a webcam, a speaker sound bar, USB ports, and the power supply and power cord.

The screen can either be formed into a 13-inch conventional laptop or rolled flat into a 17-inch tablet with stylus. Another great feature is that you can stand it upright with a fold-out support leg and watch videos as on a flat-screen TV. And with multi-touch technology you don’t need a separate mouse and keyboard, because everything is right on the screen. It might be hard to type on a digital QWERTY keyboard, but it might feel like a large iPhone.

I know it’s just a concept, but it would be really cool to see one in person and maybe use one some day. To really appreciate the capabilities and possibilities of the Rolltop, check out the video from Orkin Design.


“SIMPLICITY, CARRIED TO THE EXTREME, BECOMES ELEGANCE.” —JON FRANKLIN

1.4 MATERIAL

With the environment and sustainability firmly in mind, the Dell Froot concept saves the planet courtesy of two projectors: one for the virtual keyboard, and another for the monitor.

Designed by Pauline Carlos as part of a sustainability contest sponsored by Dell, the Froot also uses a colorful case that’s constructed out of a biodegradable starch-based polymer. As it’s a futuristic concept, the lack of a mouse is understandable—we’ll no doubt be using our brains by then.

More seriously, pico projectors are *almost* there, but not quite; otherwise I’d be asking why this is still just a concept.

2_ [FUNCTIONALITY]

2.1 APPLICATION SOFTWARE

In computer science, an application is a computer program designed to help people perform a certain type of work. An application thus differs from an operating system (which runs a computer), a utility (which performs maintenance or general-purpose chores), and a programming language (with which computer programs are created).

Depending on the work for which it was designed, an application can manipulate text, numbers, graphics, or a combination of these elements. Some application packages offer considerable computing power by focusing on a single task, such as word processing; others, called integrated software, offer somewhat less power but include several applications. User-written software tailors systems to meet the user’s specific needs. User-written software includes spreadsheet templates, word processor macros, scientific simulations, and graphics and animation scripts. Even email filters are a kind of user software. Users create this software themselves and often overlook how important it is.
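To make the idea of user-written software concrete, here is a minimal sketch of a personal email filter in Python. It is an illustration only: the message fields, folder names and domain are assumptions for the example, not the API of any real mail client.

# A sketch of "user-written software": a personal email filter.
# The message fields and folder names are assumptions for this
# example, not the API of any real mail client.

def filter_message(msg):
    """Route one message to a folder based on simple personal rules."""
    sender = msg.get("from", "").lower()
    subject = msg.get("subject", "").lower()
    if "newsletter" in sender or "unsubscribe" in subject:
        return "Bulk"
    if sender.endswith("@example-employer.com"):   # hypothetical domain
        return "Work"
    return "Inbox"

messages = [
    {"from": "news@daily.example.com", "subject": "Weekly digest"},
    {"from": "boss@example-employer.com", "subject": "Q3 planning"},
]
for m in messages:
    print(filter_message(m), "<-", m["from"])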

The delineation between system software such as operating systems and application software is not exact, however, and is occasionally the object of controversy. For example, one of the key questions in the United States v. Microsoft antitrust trial was whether Microsoft’s Internet Explorer web browser was part of its Windows operating system or a separable piece of application software. As another example, the GNU/Linux naming controversy is, in part, due to disagreement about the relationship between the Linux kernel and the operating systems built over this kernel.

In some types of embedded systems, the application software and the operating system software may be indistinguishable to the user, as in the case of software used to control a VCR, DVD player or microwave oven. The above definitions may exclude some applications that exist on some computers in large organizations. For an alternative definition of an application, see Application Portfolio Management.

2.2 COMMUNICATION

CLOUD COMPUTING

Cloud computing is Internet-based computing, whereby shared resources, software and information are provided to computers and other devices on-demand, like a public utility.

It is a paradigm shift following the shift from mainframe to client-server that preceded it in the early ’80s. Details are abstracted from the users, who no longer need expertise in, or control over, the technology infrastructure “in the cloud” that supports them. Cloud computing describes a new supplement, consumption and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet. It is a byproduct and consequence of the ease of access to remote computing sites provided by the Internet.

The term cloud is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents. Typical cloud computing providers deliver common business applications online which are accessed from another web service or software like a web browser, while the software and data are stored on servers.

The majority of cloud computing infrastructure consists of reliable services delivered through data centers and built on servers. Clouds often appear as single points of access for all consumers’ computing needs. Commercial offerings are generally expected to meet customers’ quality of service (QoS) requirements, and typically offer service level agreements (SLAs).

In general, cloud computing customers do not own the physical infrastructure, instead avoiding capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed, whereas others bill on a subscription basis. Sharing “perishable and intangible” computing power among multiple tenants can improve utilization rates, as servers are not unnecessarily left idle (which can reduce costs significantly while increasing the speed of application development). A side effect of this approach is that overall computer usage rises dramatically, as customers do not have to engineer for peak load limits.
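A toy calculation can show why the utility model appeals to customers. The sketch below compares a month of pay-per-use billing against owning enough servers for the daily peak; every price and load figure is an assumption invented for the example.

# A toy comparison of the utility model ("pay only for what you use")
# with owning enough servers for peak load. Every number here is an
# assumption chosen only to illustrate the idea.

HOURLY_RATE = 0.10            # assumed price per server-hour
PEAK_SERVERS = 100            # capacity needed during the daily peak
HOURS = 24 * 30               # one month

# Assumed hourly demand: a three-hour evening peak, quiet otherwise.
demand = [PEAK_SERVERS if 18 <= h % 24 <= 20 else 10 for h in range(HOURS)]

owned_cost = PEAK_SERVERS * HOURS * HOURLY_RATE   # idle servers still cost money
cloud_cost = sum(demand) * HOURLY_RATE            # pay per server-hour consumed

print(f"owning peak capacity: ${owned_cost:,.2f}")
print(f"pay-per-use cloud:    ${cloud_cost:,.2f}")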



THE CURRENT STATE OF TWITTER

There is no doubt that Twitter has been a runaway success. Add to its rapid growth the recently announced @anywhere platform and plans for further international expansion, and it comes as no surprise that the company is not looking to sell — at least not within the next two years.

While the site’s growth has certainly been impressive and it has reached the point of non-displacement, there are some interesting hidden truths about Twitter and its users. The following graphic takes a look at Twitter’s path to 10 billion tweets, what we have learned about its users and what they’ve been talking about along the way.


2.3 NATURAL USER INTERFACE

Bill Buxton, a principal researcher for Microsoft, talks about working on new systems that allow people to work more naturally with computers.

Even four decades ago, Buxton could picture a future enhanced by technology. Eventually he came to dream about humans and computers having close interaction: being able to operate a computer by gesturing at it or by touching it, or having a computer recognize your voice and face.

“I’m excited more now than I have been since I’ve been in the business because I can taste it now,” says Buxton, a principal researcher at Microsoft since 2005. “Stuff I’ve been working towards and thinking about and dreaming about for 20 or 30 years is now at the threshold of general usage.”

Touch, face- and voice-recognition, movement sensors – all are part of an emerging field of computing often called natural user interface, or NUI. Interacting with technology in these humanistic ways is no longer limited to high-tech secret agents and Star Trek. Buxton says everyone can enjoy using technology in ways that are more adaptive to the person, location, task, social context and mood.

He sees a bright future in which entire “ecosystems” of devices and appliances co-exist with humans in a way that makes life better for people. Microsoft, with researchers like Buxton, is a leader in developing these new, more natural ways of interacting with computers. The company will showcase some of this technology at the 2010 International Consumer Electronics Show (CES) in Las Vegas this week.

Project Natal, which turns game players into human controllers, is among the most high-profile examples of the coming shift in technology, Buxton said. Microsoft announced at CES this week that the Xbox gaming device will be available in stores this coming holiday season.

Project Natal is the code name for an Xbox 360 add-on that incorporates face, voice, gesture, and object recognition technology to give users a variety of ways to interact with the console, all without needing a controller. It’s a “delightful” new way to spend time with friends and family playing games, watching TV shows and movies, and listening to music, says Robbie Bach, president of Microsoft’s Entertainment and Devices Division.


Bach says Project Natal and other NUI-related products will offer more natural ways to interact with video games, computers and other technology.

“For me, when people talk about touch and voice technologies, or anything related to Natural User Interface, it all comes back to what’s most natural for the users,” Bach says. “That’s why you’ll see a variety of user interfaces that are considered natural, because each one is tuned to the environment in which it operates.”

The holiday 2010 release of Project Natal will come exactly one decade after the first Xbox console hit the shelves in the holiday season of 2000.

“Natal is a next-generation experience that we’re actually delivering this generation,” says Aaron Greenberg, director of product management for Xbox 360. “And they don’t even need to buy a new console or controllers.”

The goal of natural interfaces is not to make the keyboard and mouse obsolete, says August de los Reyes, principal director of user experience for Microsoft Surface.

Instead, NUI is meant to remove mental and physical barriers to technology, to make computing feel more intuitive, and to expand the palette of ways users can experience technology.

Whether it’s a receptionist and patient at a doctor’s office separated by a large computer monitor, or a family in a living room sitting together in silence, parents immersed in laptops and kids texting away on cell phones, technology is increasingly creating situations de los Reyes calls “connected but alone.”

“Technology today isolates people,” de los Reyes says. NUI and Microsoft Surface are “almost anti-technology solutions.”

A member of the Advanced Studies Program at the Harvard University Graduate School of Design, and a former visiting associate at Oxford, de los Reyes has become a leader in the field of finding new and intuitive ways to interact with computers.

He says natural interfaces are just the latest in a long line of evolving human-computer interaction.

In the last few decades, as computers became widely used, early on there was the command line interface (CLI), with its flashing cursor calling on users to type in commands. Then came the graphical user interface (GUI), with its point-and-click mouse and a desktop with icons and windows.


Both interfaces were revolutionary in their time, and natural interfaces are the next step, de los Reyes said.

Microsoft is “absolutely leading” in the category of natural interfaces, de los Reyes says.

Microsoft has released, and is continually developing, a number of products that incorporate touch, gestures, speech, and more to make user-computer interaction more natural – more like the way humans interact with each other.

It’s a video game that a grandmother can play with her grandson, using intuitive body movements to compete rather than having to learn to use a controller, as with Xbox’s Project Natal.

Or an in-car communications and entertainment system such as Microsoft Auto’s Ford SYNC, which responds to a driver’s voice commands, playing favorite songs or answering text messages so drivers can keep their eyes on the road, hands on the wheel, and mind on their driving. Kia Motors also announced this week its new UVO in-car entertainment system, which the car company developed with Microsoft.


And it is a table that acts as a collaborative, massive multi-touch-screen computer, such as Microsoft Surface, or a voice-enabled Windows phone device, or a Windows 7 laptop that lets users navigate files or the Web using their fingers or a pen tool.

Though the human-computer relationship is becoming more personalized, it’s also becoming more personally contained. The Pixar movie “Wall-E” darkly portrayed one possible version of the future, in which technology usurped even the most basic human interactions, with humans moving about in high-tech chairs that meet their every need, including communication, even if the person they are communicating with is sitting right next to them.

Though the movie’s message was a warning about technology surpassing humanism, it’s a future that’s not necessarily out of the question yet, says Buxton.

“Without informed design, [technology] is far more likely to go bad than good,” Buxton says. “It’s too important and plays too large a part of our lives to leave these things to chance. The only way it’s going to come out right is if we really work hard on understanding that … it’s about people, it’s not about technology.”

Technology for the sake of technology doesn’t interest Buxton. What interests him is when technology takes a more subordinate position – the supporting actor to humankind’s starring role.

Buxton has won a number of awards and honors for his work, which advocates innovation, design, and “the appropriate consideration of human values and culture in the conception, implementation, and use of new technology.” He also frequently teaches and speaks on the subject, and his writings include a book and a regular column for BusinessWeek magazine.

“It’s not about interface design, it’s about ‘out of your face’ design,” Buxton says. “How do I get the technology out of my face so I can focus on the stuff that’s of interest to me – the material I’m reading, the film I’m viewing, the person I’m talking to, the problem I’m trying to solve – and doing so in a way that brings unexpected delight.”

But creating a more natural relationship between user and technology is not merely a matter of removing mice, keyboards, buttons, and knobs, or of adding new input methods such as speech, touch, and in-air gestures.

“The days are over where a one-size-fits-all interface is appropriate, or even acceptable,” Buxton says.

Some technology, although advanced, is not appropriate or natural in certain situations. For example, text messaging is widely used, but driving or walking while texting is difficult – even dangerous. Speech-recognition technology works well for driving or walking, but works poorly on an airplane where privacy is important, or in a noisy, crowded restaurant.

“The trick of elegant design is making sure you do the right thing the right way for the right person at the right time. What’s right here may be wrong there,” Buxton said.
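Buxton’s point lends itself to a small sketch. The Python toy below picks an input method based on the user’s situation; the contexts and rules are invented for illustration and are not drawn from any Microsoft product.

# A toy sketch of context-aware modality selection, illustrating the
# "right thing, right way, right person, right time" idea. The contexts
# and rules are invented for illustration.

def choose_modality(context):
    """Pick an input method suited to the user's current situation."""
    rules = {
        "driving": "speech",             # eyes and hands are busy
        "open_plan_office": "touch",     # speech disturbs others, leaks privacy
        "noisy_restaurant": "touch",     # speech recognition degrades in noise
        "living_room": "gesture",        # across-the-room play, as with Natal
    }
    return rules.get(context, "keyboard and mouse")  # conservative default

for ctx in ["driving", "open_plan_office", "noisy_restaurant", "living_room"]:
    print(ctx, "->", choose_modality(ctx))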

Microsoft leaders say no other company is as well situated to create new user interfaces across a range of devices and contexts. “These are all pieces of a larger puzzle that we are methodically trying to solve in this emerging field,” Buxton says.

Best of all, Buxton notes, after so many years of dreaming about the true potential of computers, “I’m still around to touch and play with it.”

3_ [HUMANITY]

IMAGINE A WORLD WHERE YOUR PHONE IS SMART ENOUGH TO ORDER AND PAY FOR YOUR MORNING COFFEE. NO MORE GIVING ORDERS, HANDING OVER YOUR PAYMENT OR WAITING IN LINES. NO MORE FACE-TO-FACE CHIT-CHAT OR HUMAN INTERACTION.

3.1 TRANSFORM

For many, this might seem like a blessing. Who likes to wait in line? But on a grand scale, might this kind of automated world dramatically change — perhaps even eliminate — how we communicate and connect with one another? Could it change something about us as individuals, or as a whole society?

“My short answer is yes. It’s absolutely changing society and the way people are,” says Melissa Cefkin, an ethnographer at IBM. “But there’s nothing new in that. We’ve always had the introduction of new technologies that transform and move society in new ways. It changes our interactions, our sense of the world and each other.”

But if primitive hand tools changed us from gatherers to hunters, and the invention of the printing press propagated literacy while downgrading the importance of the oral tradition, what individual and cultural transformations do new computer technologies portend?

Researchers and technologists alike say they’re already seeing technology-wrought changes in how we operate as individuals and as a society. To be clear, they’re not finding evidence of evolutionary transformations -- those show up over thousands of years, not merely decades. But there have been shifts in individual and societal capabilities, habits and values. And just how these will all play out remains to be seen.

“We’re in a big social experiment. Where it ends up, I don’t know,” says Dan Siewiorek, a professor of computer science and electrical and computer engineering and director of the Human-Computer Interaction Institute at Carnegie Mellon University.

Like other researchers, Siewiorek has identified a number of areas in which individuals and societies have changed in response to technology during the past two decades. One of the most obvious, he says, is the shift in how we view privacy. Having grown up in the McCarthy era, Siewiorek remembers how guarded people were with their phone conversations, fearful that they would be overheard or, worse, recorded.


“TECHNOLOGY CHANGES OUR INTERACTIONS, OUR SENSE OF THE WORLD AND EACH OTHER.” —MELISSA CEFKIN

WE’RE IN A BIG SOCIAL EXPERIMENT. WHERE IT ENDS UP, I DON’T KNOW.

“Now you can sit in any airport and hear all the grisly details about a divorce or something like that. I don’t know if the convenience overrides privacy or people don’t care about the other people around them, but certainly what we let hang out there has changed,” he says.

Any doubts? Just look at the deeply personal details that people post on YouTube, MySpace and Facebook.

At the same time, people have used this willingness to share via technology to forge new definitions of community. “There are certainly different versions of community emerging, and that’s facilitated by innovative uses of technology,” says Jennifer Earl, associate professor of sociology and director of the Center for Information Technology and Society at the University of California, Santa Barbara.

A hundred years ago, neighbors would come together for a barn raising, willing to put in hard labor because they might need similar help someday. Today, Earl says, technology -- whether it’s Twitter or e-mails or a viral video appeal -- can spur people across the world to the same type of communal action, even if they have no personal connection to the individuals helped or the tasks involved.

“Today, with technology, we can enable people to act collectively across boundaries. And one of the things that is different today isn’t that we can just act collectively very quickly, but we act across heterogeneous groups,” Earl says.

She points to the collective actions taken to help the victims of Hurricane Katrina as an example. Citizens throughout the U.S. posted their spare items, from clothes to extra rooms, that displaced Louisiana residents could then use in their time of need.

And it doesn’t take an emergency for new and different types of communities to emerge. “Technology changes the whole idea of who we belong with,” says anthropologist Ken Anderson, a senior researcher at Intel Corp.

In the past, community members had some sense of a shared history and shared goals and objectives. Today, an online community can have more specific, tailored interests than would likely be found in a physical neighborhood or town, whether it’s a rare disease, a passion for running or an interest in a celebrity.


3.2 NEW LANGUAGE, NEW VALUES

Our ability to reach across time and space and build connections via technology with anyone, anywhere and at any time is changing more than our sense of community; it’s changing how we communicate, too.

“There is a new language being produced, although it’s not replacing our existing language,” says anthropologist Patricia Sachs Chess, founder and president of Social Solutions Inc., a consulting firm in Tempe, Ariz.

Chess and others point to the use of slang and jargon (both pre-existing and newly developed for today’s instant communication tools), phonics, abbreviations and colloquial syntax as the evolving standards for electronic discourse.

And this new vernacular is spilling over into traditional writing and oral exchanges. “The first thing that comes to mind is the term bandwidth,” Chess says. “It is a technology term and has become incorporated in the language in ways such as, ‘Do you have enough bandwidth to take on that project?’ There’s also ‘I’ll IM you’ and ‘Just text me.’”

While we aren’t seeing those yet in formal writing, she says, they are common in casual writing such as emails and in everyday conversation.

This emerging language could presage even deeper changes in what we value, which skills we possess and, ultimately, what we’re capable of. For example, Gregory S. Smith, vice president and CIO at the World Wildlife Fund, a Washington-based nonprofit, says he has seen the quality of writing among younger generations of workers decline in the past decade or so, corresponding with the rise of instant messaging, texting and Twitter.

“The advent of tools that allow for these short types of content are flooding the Internet with writing that doesn’t matter, and they’re lowering the grammatical and writing skills of our up-and-coming professionals,” says Smith, who also teaches at Johns Hopkins University.


3.3 DIGITAL NARCISSISM

Others voice deeper concerns about this evolving digital community. Go back to that example of the smartphone ordering and paying for your morning coffee. Yes, it might eliminate waiting in long lines, but ultimately it could also affect our capacity to interact meaningfully with one another. Evan Selinger, an assistant professor in the philosophy department at the Rochester Institute of Technology, explains (ironically enough) via e-mail: “The problems posed by automation are not new, and that scenario would not present any distinctive problems, were it an isolated convenience. However, the scenario does pose a deep moral challenge because it can be understood as part of a growing trend in digital narcissism.”

Digital narcissism, Selinger explains, “is a term that some use to describe the self-indulgent practices that typify all-too-much user behavior on blogs and social networking sites. People often use these mediums as tools to tune out much of the external world, while reinforcing and further rationalizing overblown esteem for their own mundane opinions, tastes and lifestyle choices.”

Others point out that technology isn’t just changing our connections and how we communicate with one another. It’s also affecting our cognitive skills and abilities -- even what it means to be intelligent.

Researchers say the constant stimulation and ongoing demands created by technology, as we jump from texting to videoconferences to a phone call to listening to our MP3 players, seem to affect how we organize our thoughts.

Christopher R. Barber, senior vice president and CIO at Western Corporate Federal Credit Union in San Dimas, Calif., says he has noticed that some of his workers, notably the younger ones, are skilled at multitasking using various technologies. “And the results show that the work is getting done, and it’s getting done well,” he says.

IBM’s Cefkin confirms that we have become better able to multitask as we’ve gotten used to these technologies. “But the question is, can you then fully participate in something? Scientific studies have come out on both sides,” she says.

“PEOPLE OFTEN USE THESE MEDIUMS AS TOOLS TO TUNE OUT MUCH OF THE EXTERNAL WORLD.” —EVAN SELINGER


SOME QUESTION WHETHER THE CONSTANT BARRAGE OF TECHNOLOGY HAS REDUCED OUR ABILITY TO ENGAGE IN DEEP THINKING.

Winslow Burleson, an assistant professor of human-computer interaction at Arizona State University, says studies have found that the amount of attention many of us can devote to a single specific task is about three minutes -- 15 at the most.

Burleson doesn’t put the blame on technology alone, noting that there are multiple factors in modern life that could be contributing to this. But, he says, technology is indeed a factor. It has enabled so much multitasking that many people simply lack experience in focusing on one task for an extended period of time. “So some people are concerned about the ability to do thinking at a deep level,” Burleson says.

Siewiorek says he has seen the effect of this lack of deep thinking in how people gather information. The Internet, and the ease with which people share information via technology, allows them to gather data -- whether accurate or not -- and use it without necessarily understanding its context.

“I don’t see the deep thinking. I see superficial connecting of dots rather than logical thinking,” Siewiorek says. “I see people who are not going to the source anymore. They just forward things. There’s no in-depth research going on. It seems that people have lost history. [They don’t ask] ‘Where did these things come from? Who said it first?’ “

3.4 BRAIN CHANGES

There does seem to be something going on inside the brain these days, says Intel’s Anderson. Researchers are finding differences in the brains of those who grew up wired, with tests showing that the neurons in the brains of younger people fire differently than in those of older generations.

“I don’t know what that means; I don’t think anybody knows,” Anderson says.

But some question whether we even need the same thinking skills as we had in the past. After all, why should we memorize facts and figures when search engines, databases and increasingly powerful handheld computing devices make them instantly available?

Years ago, intelligence was measured in part by memory capacity, says Brad Allenby, a professor of civil, environmental and sustainable engineering at Arizona State University who studies emerging technologies and transhumanism. “Now Google makes memory a network function,” he says. “Does that make memory an obsolete brain function? No, but it changes it. But how it will be changed isn’t clear. We won’t know what will happen until it happens.”

Maybe so, but Allenby is already predicting how we’ll measure intelligence in a world flooded with electronic technology.


“Once we get seriously into [augmented cognition] and virtual reality, the one who has the advantage isn’t the one who is brilliant but the one who can sit in front of the computer screen and respond best,” he says. “The one who will be best is the one best integrated with the technology.”

For society, the computer is just the latest in a string of new technologies requiring adaptation, anthropologists say. In the past, we’ve embraced other technologies, including farming tools and the printing press, and we’ve managed to remain human, even if some of our skills and values have evolved.

For example, Cefkin says she used to pride herself on how many phone numbers she could remember. “Now I can’t even remember my own,” she says. “We can all point to things we used to do that we can no longer do because of technology, but we don’t talk about the things we now have. I remember where computer files are today. That’s not a different realm of memory or experience” than memorizing telephone numbers.

Besides, Cefkin adds, society often finds ways to fix the glitches that technologies introduce. Too many phone numbers to memorize? Cell phones with memory, or a smart card containing your important data, can replace a brain full of memorized numbers -- and do much more to boot, she adds.

In the end, Cefkin and others point out, it’s still humans who are in control of the technology and will determine whether the “advancements” technology enables will make a positive or negative impact on our world.

“Technology is another artifact in human existence that shifts us, but it does not replace thinking, it does not replace figuring out. It does not do that job for people; it can only enhance it,” says Social Solutions’ Chess. “Because human beings can do what technology will never be able to do -- and that’s form judgments.”

FUTURE OF COMPUTER_ ARTIFICIAL INTELLIGENCE_