21st Century Computer Architecture: A Community White Paper. NSF Outbrief, June 22, 2012.



  • Slide 1
  • 21st Century Computer Architecture: A Community White Paper. NSF Outbrief on June 22, 2012. Outline: Information & Communication Technology's Impact; Semiconductor Technology's Challenges; Computer Architecture's Future; Pre-Competitive Research Justified; Process, Participants & Backups
  • Slide 2
  • 20th Century ICT Set Up: Information & Communication Technology (ICT) has changed our world o Required innovations in algorithms, applications, programming languages, & system software. Key (invisible) enablers of (cost-)performance gains: o Semiconductor technology (Moore's Law) o Computer architecture (~80x per Danowitz et al.)
  • Slide 3
  • Enablers: Technology + Architecture. [Figure: performance gains attributed to Technology vs. Architecture. Source: Danowitz et al., CACM 04/2012, Figure 1]
  • Slide 4
  • 21st Century Promise: ICT promises much more o Data-centric personalized health care o Computation-driven scientific discovery o Human network analysis o Much more: known & unknown. Characterized by: o Big Data o Always Online o Secure/Private. Whither enablers of future (cost-)performance gains?
  • Slide 5
  • Technology's Challenges 1/2. Late 20th Century vs. The New Reality: o Moore's Law, 2x transistors/chip -> transistor count still 2x, BUT o Dennard Scaling, ~constant power/chip -> gone; can't repeatedly double power/chip
  • Slide 6
  • Classic CMOS Dennard Scaling: the Science behind Moore's Law. National Research Council (NRC) Computer Science and Telecommunications Board (CSTB.org). Scaling (by factor κ): oxide thickness t_OX/κ, voltage V/κ. Results: power/ckt 1/κ²; power density ~constant (Finding 2). (A one-line derivation follows below.) Source: The Future of Computing Performance: Game Over or Next Level?, National Academies Press, 2011
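A one-line derivation of the classic Dennard result, as a sketch from the standard dynamic-power relation; the scale factor κ and the per-area accounting are the usual textbook assumptions, not text taken from the slide:

\[
P_{\text{ckt}} = C V^2 f \;\longrightarrow\; \frac{C}{\kappa}\cdot\frac{V^2}{\kappa^2}\cdot(\kappa f) = \frac{P_{\text{ckt}}}{\kappa^2}
\quad\text{when } C \to C/\kappa,\; V \to V/\kappa,\; f \to \kappa f.
\]

With roughly κ² times as many circuits per unit area after scaling, power density stays approximately constant, which is what allowed decades of density and frequency gains at fixed chip power.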
  • Slide 7
  • Post-Classic CMOS Dennard Scaling: power density no longer ~constant. National Research Council (NRC) Computer Science and Telecommunications Board (CSTB.org). Scaling (by factor κ): oxide thickness t_OX/κ, but voltage no longer scales (V/κ -> V). Results: power/ckt 1/κ² -> ~1; power density ~constant -> ~κ². Post-Dennard CMOS scaling rule. (The corresponding derivation follows below.) TODO: chips with higher power (no), smaller chips, dark silicon, or other (?)
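The same relation with voltage held fixed shows why power density now rises; again a sketch under the same idealized assumptions (in practice frequency no longer scales up by κ either, which is exactly the constraint the slide points at):

\[
P_{\text{ckt}} = C V^2 f \;\longrightarrow\; \frac{C}{\kappa}\cdot V^2\cdot(\kappa f) = P_{\text{ckt}}
\quad\text{when } C \to C/\kappa,\; V \to V,\; f \to \kappa f,
\]

so with κ² times as many circuits per unit area, power density would grow by roughly κ². The escape hatches are lower frequency, smaller chips, dark silicon, or higher-power packages, hence the slide's TODO list.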
  • Slide 8
  • Technology's Challenges 2/2. Late 20th Century vs. The New Reality: o Moore's Law, 2x transistors/chip -> transistor count still 2x, BUT o Dennard Scaling, ~constant power/chip -> gone; can't repeatedly double power/chip o Modest (hidden) transistor unreliability -> increasing transistor unreliability can't be hidden o Focus on computation over communication -> communication (energy) more expensive than computation o One-time costs amortized via mass market -> one-time cost much worse & want specialized platforms. How should architects step up as technology falters?
  • Slide 10
  • 21st Century Computer Architecture. 20th Century vs. 21st Century: o Single chip in stand-alone computer -> architecture as infrastructure, spanning sensors to clouds; performance plus security, privacy, availability, programmability, ... o Performance via invisible instruction-level parallelism -> energy first: parallelism, specialization, cross-layer design o Predictable technologies (CMOS, DRAM, & disks) -> new technologies (non-volatile memory, near-threshold, 3D, photonics, ...); rethink memory & storage, reliability, communication. Cross-cutting: break current layers with new interfaces
  • Slide 11
  • What Research Exactly? Example: o Dream of sensor/phone apps that would burn >> 1 W in a 1 W package o Note: short-term use often followed by long idle period o Exploit Computational Sprinting (then return to idle!): 16 cores (32 W) for 100s of ms in a 1 W package, using a phase-change-material thermal capacitor (see the back-of-the-envelope sketch below). Research areas in white paper (& backup slides): 1. Architecture as Infrastructure: Spanning Sensors to Clouds 2. Energy First 3. Technology Impacts on Architecture 4. Cross-Cutting Issues & Interfaces. Computational Sprinting funded by NSF CCF-1161505. Much more research developed by future PIs!
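A back-of-the-envelope sketch (Python) of the thermal budget behind sprinting. The 0.5 s sprint duration and the ~200 J/g latent heat of a paraffin-like phase-change material are illustrative assumptions, not figures from the slide or the HPCA 2012 paper; only the 32 W sprint and 1 W package numbers come from the slide.

# Rough energy budget for computational sprinting in a 1 W package.
sprint_power_w = 32.0        # 16 cores active during the sprint (slide's figure)
sustained_power_w = 1.0      # what the package can dissipate continuously (slide's figure)
sprint_duration_s = 0.5      # assumed: "100s of ms" sprint

# Heat that must be buffered by the phase-change material (PCM) during the sprint.
excess_energy_j = (sprint_power_w - sustained_power_w) * sprint_duration_s

latent_heat_j_per_g = 200.0  # assumed latent heat of fusion for a paraffin-like PCM
pcm_mass_g = excess_energy_j / latent_heat_j_per_g

# Idle time needed to re-solidify the PCM while dissipating at the sustained rate.
recharge_time_s = excess_energy_j / sustained_power_w

print(f"Excess heat to buffer: {excess_energy_j:.1f} J")
print(f"PCM mass needed:       {pcm_mass_g:.2f} g")
print(f"Recharge (idle) time:  {recharge_time_s:.0f} s")

With these assumptions the sprint deposits about 15.5 J into the PCM, needing on the order of 0.1 g of material and about 16 s of idle time to recover, which is why the slide stresses that short bursts must be followed by long idle periods.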
  • Slide 12
  • Validating Thermal Models. Source: Computational Sprinting (talk), NSF CCF-1161505; Raghavan, Luo, Chandawalla, Papaefthymiou, Pipe, Wenisch & Martin, HPCA 2012 ( http://www.cis.upenn.edu/acg/papers/DaSi-talk.pptx )
  • Slide 13
  • Pre-Competitive Research Justified: retain the (cost-)performance enabler of the ICT revolution. http://cra.org/ccc/docs/init/21stcenturyarchitecturewhitepaper.pdf Successful companies cannot do this by themselves: o Lack needed long-term focus o Don't want to pay for what benefits all o Resist transcending interfaces that define their products
  • Slide 14
  • White Paper Process. Late March 2012: o CCC contacts coordinator & forms group. April 2012: o Brainstorm (meetings/online doc) o Read related docs (PCAST, NRC Game Over, ACAR 1/2, ...) o Use online doc for intro & outline, then parallel sections o Rotated authors to revise sections. May 2012: o Brainstorm list of researchers in/out of computer architecture o Solicit researcher feedback/endorsement o Do distributed revision & redo of intro o Release May 25 to CCC & via email. Kudos to participants for executing on a tight timetable
  • Slide 15
  • White Paper Participants (* contributed prose; ** effort coordinator). Thanks to the CCC, Erwin Gianchandani & Ed Lazowska for guidance and Jim Larus & Jeannette Wing for feedback. Sarita Adve, U Illinois *; David H. Albonesi, Cornell U; David Brooks, Harvard U; Luis Ceze, U Washington *; Sandhya Dwarkadas, U Rochester; Joel Emer, Intel/MIT; Babak Falsafi, EPFL; Antonio Gonzalez, Intel/UPC; Mark D. Hill, U Wisconsin *,**; Mary Jane Irwin, Penn State U *; David Kaeli, Northeastern U *; Stephen W. Keckler, NVIDIA/U Texas; Christos Kozyrakis, Stanford U; Alvin Lebeck, Duke U; Milo Martin, U Pennsylvania; José F. Martínez, Cornell U; Margaret Martonosi, Princeton U *; Kunle Olukotun, Stanford U; Mark Oskin, U Washington; Li-Shiuan Peh, M.I.T.; Milos Prvulovic, Georgia Tech; Steven K. Reinhardt, AMD; Michael Schulte, AMD/U Wisconsin; Simha Sethumadhavan, Columbia U; Guri Sohi, U Wisconsin; Daniel Sorin, Duke U; Josep Torrellas, U Illinois *; Thomas F. Wenisch, U Michigan *; David Wood, U Wisconsin *; Katherine Yelick, UC Berkeley/LBNL *
  • Slide 16
  • Back Up Slides. Detailed research areas in white paper: 1. Architecture as Infrastructure: Spanning Sensors to Clouds 2. Energy First 3. Technology Impacts on Architecture 4. Cross-Cutting Issues & Interfaces. http://cra.org/ccc/docs/init/21stcenturyarchitecturewhitepaper.pdf Findings from the National Academy "Game Over" study. Glimpse at the DARPA/ISAT workshop "Advancing Computer Systems without Technology Progress"
  • Slide 17
  • 1. Architecture as Infrastructure: Spanning Sensors to Clouds. Beyond a chip in a generic computer, to a pillar of 21st century societal infrastructure. o Computation in context (sensor, mobile, ..., data center) o Systems often large & distributed o Communication issues can dominate computation o Goals beyond performance (battery life, form factor). Opportunities (not exhaustive): o Reliable sensors harvesting (intermittent) energy o Smart phones to Star Trek's medical tricorder o Cloud infrastructure suitable for both Big Data streams & low-latency quality-of-service with stragglers o Analysis & design tools that scale
  • Slide 18
  • 2. Energy First. Beyond single-core performance, to (cost-)performance per watt/joule. Energy across the layers: o Circuit/technology (near-threshold CMOS, 3D stacking) o Architecture (reducing unnecessary data movement) o Software (communication-reducing algorithms). Parallelism to save energy: o Vast (fine-grained) homogeneous & heterogeneous o Improved SW stack o Applications focus (beyond graphics processing units). Specialization for performance & energy efficiency: o Abstractions for specialization (reducing one-time cost) o Energy-efficient memory hierarchies o Reconfigurable logic structures
  • Slide 19
  • 3. Technology Impacts on Architecture. Beyond the CMOS, DRAM, & disks of the last 3+ decades, to: Replacement circuit technologies o Sub/near-threshold CMOS, QWFETs, TFETs, and QCAs. Non-volatile storage o Beyond flash memory to STT-RAM, PCRAM, & memristors. 3D die stacking & interposers o Logic, cache, small main memory. Photonic interconnects o Inter- & even intra-chip. Design automation o From circuit design with new technologies to o pre-RTL functional, performance, power, area modeling of heterogeneous chips & systems
  • Slide 20
  • 4. Cross-Cutting Issues & Interfaces. Beyond performance with stable interfaces, to: New design goals (for a pillar of societal infrastructure) o Verifiability (bugs kill) o Reliability (a dependable computing base?) o Security/Privacy (with non-volatile memory?) o Programmability (time to a correct, performant solution). Better interfaces: o High-level information (quality of service, provenance) o Parallelism ((in)dependence, (lack of) side-effects) o Orchestrating communication ((recursive) locality) o Security/Reliability (fine-grain protection)
  • Slide 21
  • Executive Summary (added to National Academy slides). Highlights of National Academy findings: (F1) Computer hardware has transitioned to multicore (F2) Dennard scaling of CMOS has broken down (F3) Parallelism and locality must be exploited by software (F4) Chip power will soon limit multicore scaling. Eight recommendations, from algorithms to education. We know all of this at some level, BUT: Are we all acting on this knowledge or hoping for business as usual? Thinking beyond the next paper to where future value will be created? Questions asked but not answered embedded in NA talk. Briefly close with Kübler-Ross stages of grief: Denial ... Acceptance. Source: The Future of Computing Performance: Game Over or Next Level?, National Academies Press, 2011; Mark Hill talk ( http://www.cs.wisc.edu/~markhill/NRCgameover_wisconsin_2011_05.pptx )
  • Slide 22
  • The Graph. [Figure: System Capability (log scale) vs. time, 1980s through 2050s; CMOS gains flatten into a "Fallow Period" before a "New Technology" resumes growth; "Our Focus" is the fallow period.] Source: Advancing Computer Systems without Technology Progress, ISAT Outbrief (http://www.cs.wisc.edu/~markhill/papers/isat2012_ACSWTP.pdf), Mark D. Hill and Christos Kozyrakis, DARPA/ISAT Workshop, March 26-27, 2012. Approved for Public Release, Distribution Unlimited. The views expressed are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.
  • Slide 23
  • Surprise 1 of 2: Can Harvest in the Fallow Period! Two decades of Moore's-Law-like performance/energy gains by wringing out inefficiencies tolerated while harvesting Moore's Law: HW/SW specialization/co-design (3-100x); Reduce SW bloat (2-1000x); Approximate computing (2-500x). Together: ~1000x = 2 decades of Moore's Law! (A quick sanity check of the arithmetic follows below.)
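A quick sanity check of the "~1000x = 2 decades" arithmetic (Python). The 2-year doubling period and the mid-range per-technique factors are illustrative assumptions, and multiplying the three gains assumes they compose independently, which real systems only approximate:

# Two decades of Moore's-Law-style doubling every ~2 years.
doublings = 20 / 2
moore_gain = 2 ** doublings          # ~1024x

# Illustrative mid-range factors for the three fallow-period sources of gain
# quoted on the slide (3-100x, 2-1000x, 2-500x respectively).
specialization = 10.0
sw_debloating = 10.0
approximation = 10.0
combined = specialization * sw_debloating * approximation   # ~1000x if gains compose

print(f"2 decades of Moore's Law: ~{moore_gain:.0f}x")
print(f"Composed fallow-period gains: ~{combined:.0f}x")

Modest, plausible points within each quoted range already multiply out to roughly the same factor as ten more doublings, which is the sense in which the slide equates the two.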
  • Slide 24
  • Surprise 2 of 2: Systems must exploit LOCALITY-AWARE parallelism. Parallelism is necessary but not sufficient, as communication's energy costs dominate. Shouldn't be a surprise, but many are in denial. Both surprises are hard, requiring a vertical cut through SW/HW