


Google operates data centers at 13 sites globally, including this server farm in Hamina, Finland. Storing and processing so much data requires loads of energy and a dedicated cooling system. At Hamina, seawater from the Gulf of Finland cools the computers. Creating new kinds of computer memory could cut the demand for energy and make searching data faster.

Ramamoorthy Ramesh listens to Indian classical music on his smartphone, which is jammed with videos of his kids' soccer games. He streams Netflix movies on his tablet and, on his laptop, he uses Google to search the Internet several times a day. Like many of us, he's an active consumer of data in a data-centric world.

But Ramesh is also a materials scientist who has a thorough understanding of what's going on under the hood of his electronic devices, and he has a lingering concern: "The computer is very advanced, but it's not close to where it should be."

The problem, he says, is that today's users rely on computers that are much better at computing than at storing and recalling information. At the heart of every computer is a processor that carries out programmed instructions at blazing speeds so users can pay bills online, find a nearby Italian restaurant and post selfies on Instagram. But the processor also needs a place to store the results of its efforts for use milliseconds or years in the future. No existing memory technology can do both: keep up with the processor and store information for long periods of time. Modern memory devices, including random access memory (RAM), hard drives and flash, are some combination of too slow, too expensive and too energy-hogging.

This performance gap between processor and memory has existed since the first electronic computers were introduced more than a half-century ago. But those machines weren't asked to find obscure facts on the Internet, sort through patients' medical histories and mine personal profiles on social networks. The amount of data globally is expected to grow eight times larger within five years, according to IBM, and 90 percent of today's data is less than 2 years old. The era of Big Data has arrived.

For computers to successfully navigate through the barrage of superfluous data, Ramesh and a host of engineers and physicists believe they need to develop a next-generation memory device. They call it storage-class memory: a single gadget that combines speed, energy efficiency and high-density storage.

28 | SCIENCE NEWS | October 19, 2013


Memory timeline: Electronics use memory technology invented last century. Engineers have squeezed lots more performance out of these devices, but the push is on to develop game-changing storage.

Access to storage-class memory would lead to smarter, faster mobile devices with better battery life. In the long run, storage-class memory could revolutionize the way computers work, Ramesh says, enabling processor-memory hybrids that compute and remember at the same time, kind of like the human brain. It would be the first makeover of computers' fundamental architecture since the 1940s, when the word transistor first entered the lexicon.

With so much at stake, technology industry giants such as IBM, Samsung and Hewlett-Packard, along with innovative smaller outfits like Crossbar and Micron, are spending billions of dollars to probe the bit-storing potential of tiny magnets, amorphous solids and miniature grids of wire. It's a competitive game full of hype and tricky science, yet a steady stream of advances suggests that storage-class memory may soon catch up and meet the lofty performance standards of the processor.

"Everybody is on this like gangbusters because their business is at risk," says Ramesh, who researches storage-class candidates at the University of California, Berkeley. "There's definitely going to be a pathway to storage-class memory. The question is: Which technology will take us there?"

Memory timeline entries:
1945: John von Neumann publishes his memory-centric "First Draft of a Report on the EDVAC"
1956: IBM introduces magnetic hard disk drives
2008: HP announces invention of memristor
2012: Micron Technology starts selling phase-change memory

Same old architecture
The modern computer as we know it emerged in 1945, when mathematician John von Neumann penned his "First Draft of a Report on the EDVAC." The electronic computer he envisioned centered on a processor that could perform hundreds of calculations per second. But if the processor were the master chef, it needed recipes (instructions to tell it what to do) as well as a place to store ingredients (data to be calculated) and keep finished meals (the results of the calculations) warm. Von Neumann assigned those responsibilities to memory, which at the time came in the form of magnetic tape and tubes of mercury.

Von Neumann's machine, which went live in 1951, took up more floor space than a thousand iPads set side by side and outweighed an African elephant. It had only a few kilobytes of memory, but that was plenty because the processor worked so slowly.

Things got tricky, however, once processors sped up to undertake thousands, millions and then billions of calculations per second. No memory device could both exchange data with the processor billions of times a second and retain masses of information indefinitely. So engineers devised a solution: The fastest memory, which was also the most expensive, interacted directly with the processor and stored small amounts of the most urgent data. Information for the more distant future was relegated to cheaper, higher-capacity memory devices. By creating this memory hierarchy, engineers managed to keep von Neumann's basic architecture, memory that stores data plus instructions for the very busy processor, intact. "Today's computers would still be recognizable to von Neumann," says Neil Gershenfeld, a physicist and computer scientist at MIT.

In modern computers the processor's main helper is dynamic RAM, or DRAM, a chip that provides short-term, easily accessible information storage. Each DRAM cell consists of a capacitor that stores electrical energy and a transistor that serves as a swinging gate, controlling the flow of electricity to rapidly switch the capacitor between a charged state, which represents a 1, and uncharged, a 0.

DRAM, however, has an Achilles' heel: Capacitors can't hold electricity for very long. As a result, the DRAM chip requires an influx of energy 15 or so times a second to refill the capacitors. That continual need for a refresh means that the computer has got to be on for DRAM to function. It is no good for long-term storage.

Most systems use a hard disk drive for long-term memory. These drives use mechanical arms that write and read data onto cells on 3.5-inch-wide circular platters; the direction of magnetic orientation in each cell determines whether it is a 1 or a 0.

Hard drives are cheap and can store enormous amounts of data, but they are slow. It takes about 5 milliseconds for a bit (a 1 or a 0) from the processor to get stored on the disk, 5 million times as long as it takes the processor to do a calculation. In human terms, that's like a restaurant patron (the processor) deciding what to order and a waiter (the hard drive) needing more than a month to jot the order down. Forget about getting dessert.

On a more practical level, this lag explains why many computers take a couple of minutes to boot up when powered on: The operating system needs time to migrate from the hard drive to DRAM, where the processor can access it.

Engineers have spent decades trying to bridge the speed gap between processor and memory, and in 1988 computer chip giant Intel took the first step when it unveiled flash memory. Flash retains information when unpowered and can store data in cells 20 nanometers wide, enough to stockpile thousands of photos on a digital


FEATURE | MEMORY UPGRADE

camera and hundreds of apps on a smartphone. It is also relatively fast (at least compared with a hard drive), so smartphones, laptops and tablets with flash memory boot up much faster than computers with a mechanical hard drive.

A look inside: Smartphones like Apple's iPhone 5 contain short-term (RAM) and long-term (flash) memory to help the processor run apps and store photos and music for later use. Storage-class memory would increase speed and capacity as well as extend battery life.

Intel claims that it can continue making flash cheaper and faster by shrinking and stacking memory cells. But computer scientists such as Darrell Long of the University of California, Santa Cruz say flash is nearing its performance limit. The time for a faster, cheaper and more energy-efficient replacement is now.

And not just because flash could be close to maxing out. The demands placed on computers today are vastly different than in the past. Computers with the von Neumann hierarchy excel at taking a set of data, modifying it in some way and then placing it back into memory; data processing takes precedence over the actual contents of the data. Now the bigger challenge is finding jewels and trends in vast amounts of largely nonessential data. "Instead of crunching numbers for a bank, we're trying to find an answer to a question," says Paul Franzon, an electrical engineer at North Carolina State University in Raleigh. Computers need to be able to swiftly store and analyze large datasets.

Motivated by these factors, researchers began about a decade ago to hunt for a type of memory that combines the swiftness of DRAM with the capacity and longevity of disk drives.
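The hierarchy described above works because most requests are served by the fast tier; the cost of falling through to disk follows from simple expected-value arithmetic. Here is a minimal sketch in Python, where the 5-millisecond disk access comes from the article but the 10-nanosecond DRAM latency and the hit rates are assumptions chosen for illustration:

```python
# Average access time of a two-tier DRAM + hard-drive hierarchy.
# The 5 ms disk latency is from the article; the 10 ns DRAM latency
# and the hit rates below are assumed values for illustration.

DRAM_NS = 10           # assumed DRAM access time, nanoseconds
DISK_NS = 5_000_000    # ~5 ms hard-drive access, in nanoseconds

def average_access_ns(hit_rate: float) -> float:
    """Expected access time when a fraction hit_rate of requests are
    served from DRAM and the rest fall through to the hard drive."""
    return hit_rate * DRAM_NS + (1 - hit_rate) * DISK_NS

for rate in (0.999, 0.99, 0.9):
    print(f"hit rate {rate:.1%}: {average_access_ns(rate):,.0f} ns on average")
```

Even at a 99.9 percent hit rate, the average access lands hundreds of times slower than DRAM alone, which is why researchers want a single device with disk-class capacity at DRAM-class speed.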

The right material
The search for a breakthrough memory device begins in the labs of materials scientists and condensed matter physicists. Any material used to build the next-generation memory device will have to effectively distinguish between 1s and 0s by having two distinct electrical or magnetic states.
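That two-state requirement can be pictured with a toy readout model: a measurable quantity sits at one of two well-separated levels, and a threshold decides the bit. The voltage levels, threshold and noise amplitude below are invented for illustration and do not describe any real memory cell:

```python
# Toy sketch of the basic requirement for any memory material: two
# physical states separated by far more than the measurement noise,
# so a simple threshold can tell a 1 from a 0. All numbers here are
# hypothetical, chosen only to illustrate the idea.
import random

STATE_0_V = 0.2    # hypothetical "low" electrical state, volts
STATE_1_V = 1.0    # hypothetical "high" electrical state, volts
THRESHOLD_V = 0.6  # midpoint decision threshold

def read_bit(stored: int, noise_v: float = 0.1) -> int:
    """Measure the cell's level plus a little random noise, then
    decide 1 or 0 by comparing against the threshold."""
    level = STATE_1_V if stored else STATE_0_V
    measured = level + random.uniform(-noise_v, noise_v)
    return 1 if measured > THRESHOLD_V else 0

# Because the states are separated by 0.8 V and the noise is only
# ±0.1 V, readout never misidentifies a bit in this sketch.
assert all(read_bit(b) == b for b in (0, 1) for _ in range(1000))
```

Every candidate technology in this section (ferroelectric charge, resistance, phase, magnetic orientation) is, at bottom, a different choice of what physical quantity plays the role of that two-level signal.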

Ferroelectric RAM, or FRAM, functions like DRAM, with a kick. Each memory unit has a capacitor to store electricity and a transistor to switch between 1 and 0. FRAM's added benefit is that its capacitors are made from materials such as lead zirconate titanate and bismuth ferrite, which can hold charge without needing constant refreshes. "A decade ago, people thought it was very straightforward: FRAM would win the race for storage-class memory," Ramesh says.

But FRAM has some glaring weaknesses. While ferroelectric materials make great capacitors, they do not integrate easily with other components made of silicon. "You can't put FRAM on a silicon wafer directly," Ramesh says, making cheap manufacturing a challenge. Scientists are also concerned about the reliability of FRAM over the long run, though Ramesh and colleagues recently developed a technique that allowed FRAM chips to read and write data millions of times without any signs of degradation (SN: 7/13/13, p. 11). That's about 1,000 times better than flash and would allow most users to safely store data for decades.

Hewlett-Packard claims that its next-generation memory device is 100 times faster than flash and can hold at least twice as much data. Plus its device, made of titanium dioxide, gets along just fine with silicon. The HP memristor, short for memory resistor, changes its electrical properties depending on the direction of the current going through it and then remembers those charges when the power is off. HP made headlines in May 2008 when a team led by Stanley Williams introduced the memristor in Nature and demonstrated its potential for fast, high-capacity storage (SN: 5/24/08, p. 13). Still enthusiastic in 2010, Williams told an HP publication: "We believe that the memristor is a universal memory that over time could replace flash, DRAM and even hard drives."

HP won't say when it will bring the memristor to market, but it could happen as early as next year. Ramesh says HP has to address concerns about the device's long-term reliability. Meanwhile, in August a small company called Crossbar, in Santa Clara, Calif., announced that it has

Closing the gap: Computer performance has skyrocketed over the decades, yet memory is still a long way from keeping up with the processor, which can run through billions of 1s and 0s per second. The processor relies on DRAM and expensive low-capacity devices called caches to store the most urgent data. Everything else gets passed on to long-term storage such as flash and hard disk drives, but these work at a fraction of the processor's speed. Researchers want to create storage-class memory that combines the cost and robustness of hard disks with the speed of DRAM. SOURCE: Geoffrey W. Burr
[Chart: flow of data from short-term memory, across the speed gap, to long-term memory; horizontal axis is access time in nanoseconds.]


developed a similar type of fast memory called resistive RAM. The company claims to have produced a commercially viable postage stamp-sized chip that can hold a terabyte of data (that's 10^12 bytes), though it hasn't announced when its product will be available for sale.

Phase-change memory is already finding its way into electronics. The device is made of a compound of germanium, antimony and tellurium with electric properties that change depending on its temperature: It can behave as a normal solid or an amorphous, flowing substance. The idea is to melt or solidify the compound depending on whether the memory is storing a 1 or a 0. Though some researchers worry that the device requires too much energy to repeatedly heat and melt the compound, Micron Technology, headquartered in Boise, Idaho, began selling rudimentary phase-change memory for basic cell phones last year.
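As a toy illustration of how the phase determines the stored bit, the sketch below maps a cell's phase to a resistance and thresholds it on readout. The class name, the resistance values and the choice of which phase encodes a 1 are assumptions made for this sketch, not details of Micron's device:

```python
# Toy model of a phase-change memory cell (illustrative only; the
# resistance values, state names and bit mapping are made up).
# Real cells store bits in a germanium-antimony-tellurium compound:
# a quick melt-and-quench leaves it amorphous and high-resistance,
# while gentler heating recrystallizes it into a low-resistance solid.

class PhaseChangeCell:
    LOW_RESISTANCE = 1e3    # ohms, crystalline state (hypothetical value)
    HIGH_RESISTANCE = 1e6   # ohms, amorphous state (hypothetical value)

    def __init__(self):
        self.state = "crystalline"  # start in the ordered, solid phase

    def write(self, bit: int) -> None:
        # Writing heats the compound: here we let the amorphous phase
        # stand for a 0 and the crystalline phase for a 1 (a convention
        # chosen for this sketch).
        self.state = "crystalline" if bit else "amorphous"

    def read(self) -> int:
        # Reading measures resistance without disturbing the phase,
        # so the stored bit survives with the power off.
        resistance = (self.LOW_RESISTANCE if self.state == "crystalline"
                      else self.HIGH_RESISTANCE)
        return 1 if resistance < 1e4 else 0

cell = PhaseChangeCell()
cell.write(0)
print(cell.read())  # 0: high-resistance amorphous state
cell.write(1)
print(cell.read())  # 1: low-resistance crystalline state
```

Unlike the DRAM capacitor described earlier, nothing here needs refreshing: the phase, and therefore the bit, persists until the cell is deliberately reheated.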

There are other storage-class memory technologies in play too. Samsung is working on spin-transfer torque RAM, which uses electric currents to shift the magnetic orientation of a thin layer of material. And IBM is exploring racetrack memory, which relies on current darting through a tiny grid of wire to manipulate even smaller magnetic cells that can switch between 1 and 0.

All of these memory upgrades face technicalchallenges, but there are economic ones as well.Manufacturers have churned out hard drives,flash and DRAM for years, and they won't rush toadopt a risky technology. "Modern semiconduc-tor development is extremely expensive," saysGeoffrey Burr, who studies storage-class memoryat IBM's Almaden Research Center in San Jose,Calif. Companies will invest, he says, only if it'salmost definite that the technology would workas expected and sell in large numbers.

Smarter devicesThe road to creating storage-class memory hasbeen bumpier than most researchers expected,but their eyes are still on the prize. They knowthat when storage-class memory finally makesit to market, life will change for consumers andbusinesses.

"You could have a terabyte of memory in yourmobile device," Franzon says, or 30 to 60 timesas much storage as most current smartphones."It would dramatically change the user experi-ence." For example, he says, people could storethousands of movies on their phones rather thanhaving to stream them online.

Better storage-class memory is about much more than upgrading smartphones, however. Tech giants like Google and Facebook operate vast data centers that use plodding hard drives and power-hogging DRAM chips to store and analyze petabytes (more than a million billion 1s and 0s) of search terms, likes and relationship statuses. The energy costs for these mammoth facilities are steep; they require their own power plants and cooling facilities to keep them humming. In 2010, Google's servers used 2.3 million megawatt-hours of energy, enough to supply 200,000 homes for a year. Replacing hard drives and DRAM with storage-class memory would speed up servers and slash their energy needs.

[Infographic: 30 to 60 times more storage. What 1 terabyte of memory would mean for smartphone users.]

Plenty of other Big Data users would benefit as well. Doctors could quickly sift through medical records and studies to diagnose patients and prescribe treatments. Scientists could look for patterns in genetic sequences as well as astronomical images. (The Large Synoptic Survey Telescope in Chile, scheduled to begin scanning the skies within a decade, is expected to produce 30 terabytes of data, equivalent to about 4 million high-quality photographs, per night.) And government defense agencies would surely love a computer that rapidly ferrets out terrorist networks and identifies threatening messages.

Some computer scientists say that the real fun will begin once a storage-class memory device, whether souped-up flash, memristors, phase-change or an untapped mystery material, rises above its competitors. Ramesh contrasts the hierarchical approach of present-day computers, which masks the shortcomings of memory, with the complex multitasking that takes place in the human brain. If engineers can finally build memory that works in tandem with the processor, he says, then they can think about creating devices that compute and recall at the same time. Such a seismic shift would further optimize computers to do the jobs we ask of them and, perhaps finally, lead to a machine that even a visionary like von Neumann wouldn't recognize.

Explore more
• John L. Hennessy and David A. Patterson. Computer Architecture. Elsevier, 2012. bit.ly/1bpwe06
