www.panasas.com
Panasas Update on CAE Strategy
Stan Posey, Director, Vertical Marketing and Business Development, [email protected]
IDC HPC User Forum, Norfolk, VA, 14 Apr 08
Company: Silicon Valley-based; private venture-backed; 150 people WW
Technology: Parallel file system and storage appliances for HPC clusters
History: Founded 1999 by CMU Prof. Garth Gibson, co-inventor of RAID
Alliances: ISVs; Dell and SGI now resell Panasas; Microsoft and WCCS
Extensive recognition and awards for HPC breakthroughs:
Six Panasas Customers won Awards at SC07 Conference:
Panasas parallel I/O and Storage Enabling Petascale Computing
Storage system selected for LANL’s $110MM hybrid “Roadrunner” – a Petaflop IBM system with 16,000 AMD CPUs + 16,000 IBM Cell processors, 4x over LLNL BG/L
Panasas CTO Gibson leads SciDAC’s Petascale Data Storage Institute
Panasas and Prof. Gibson primary contributors to pNFS development
Panasas Company Overview
[Pie chart, installed shelves by vertical: Energy 36%; Gov 32%; Mfg 17%; HER 12%; Other 4%]
Panasas Top 3: Energy – 36%; Gov – 32%; Mfg – 17%
Panasas Business Splits by About 1/3 Government and 2/3 Industry
Source: Panasas internal, distribution of installed shelves by customer by vertical
Sample Top Customers:
Customer Vertical No. Shelves
LANL Gov 275
LLNL/SNL Gov 45
TGS Energy 75
PGS Energy 194
BP Energy 90
Boeing Mfg 117
NGC Mfg 20
Intel Mfg 98
LSU HER 17
UNO-PKI HER 11
Panasas Customers by Vertical Market
IDC: Linux Clusters 66% of HPC Market
Clusters Create the Opportunity for Panasas
Storage evolution: Direct Attached Storage (DAS) → Network Attached Storage (NAS) → Parallel NAS (Panasas)
Cluster Computing and I/O Bottlenecks
Clusters = Parallel Compute; Parallel Compute needs Parallel I/O
Conventional storage (NFS servers): the Linux compute cluster reaches storage over a single data path.
Issues: complex scaling; limited BW & I/O; islands of storage; inflexible; expensive.
Panasas parallel storage: the Linux compute cluster (i.e., MPI apps) reaches storage over parallel data paths.
Benefits: linear scaling; extreme BW & I/O; single storage pool; ease of management; lower cost.
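As a rough illustration of what parallel data paths mean at the application level, the sketch below has every MPI rank write its own slice of a shared result file through MPI-IO, rather than funneling all data through one node over a single NFS path. It is a minimal sketch assuming an MPI-IO build over a parallel file system mount such as PanFS; the file name and block size are illustrative, not from the presentation.

/* Minimal sketch: each MPI rank writes its own block of a shared file
 * through MPI-IO, so data moves over parallel paths to storage instead
 * of being funneled through one node. File name and block size are
 * illustrative assumptions. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const MPI_Offset block = 1 << 20;               /* 1 MiB per rank (assumed) */
    const MPI_Offset count = block / sizeof(double);

    double *buf = malloc(count * sizeof(double));
    for (MPI_Offset i = 0; i < count; i++)
        buf[i] = (double)rank;                      /* stand-in for this rank's partition data */

    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "results.dat",
                  MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);

    /* Collective write: rank r writes at offset r * block in the shared file. */
    MPI_File_write_at_all(fh, (MPI_Offset)rank * block, buf,
                          (int)count, MPI_DOUBLE, MPI_STATUS_IGNORE);

    MPI_File_close(&fh);
    free(buf);
    MPI_Finalize();
    return 0;
}

On a single NFS server all of these writes would contend on one path; on a parallel file system each client writes to the storage pool directly.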
ING Renault F1, CFD Centre, Enstone, UK
CAE Software
CFD (STAR-CD; STAR-CCM+); Pre/Post (ANSA, FIELDVIEW); Optimization (iSIGHT)
HPC Solution
Linux cluster (~3000 cores); PanFS 180TB file system
Requirement
Technology changed from NetApp NAS so that data could move to a parallel I/O scheme and parallel storage
Parallel storage evaluations: Panasas; Isilon IQ6000; NetApp Ontap GX FAS 6070 and 3050
Business Value
CFD provided 10% - 20% of aerodynamics gain, goal is to double those gains in 2008
Key design objectives: maximize down-force; improve aero efficiency by 9%; optimize handling characteristics
Panasas: 12 Shelves, 180 TB
Linux x86_64, 786 nodes, 3144 cores
ING Renault F1 Aerodynamics and Panasas
ING Renault F1, CFD Centre, Enstone, UK
Renault F1 CFD Centre Tops Other F1
Renault F1 – CFD Centre (Jul 08)
38 TFLOPS; 6.4 TB Memory
Panasas 180 TB file system
BMW Sauber Albert2 (Dec 06)
12.3 TFLOPS; 2 TB Memory
Quadrics HPC Networking
Panasas 50 TB file system
100 million cell problems
RedBull (Oct 06)
6 TFLOPS; 128 servers / 500 cores
Renault F1 (Jan 07)
1.8 TFLOPS; <1 TB Memory
25 million cell problems
Panasas: 12 Shelves, 180 TB
Linux x86_64, 786 nodes, 3144 cores
ING Renault F1 Aerodynamics and Panasas
Panasas:
Now the file system of choice for the two largest F1 clusters
CD-adapco, CAE Consulting Center, Plymouth, MI
CAE Software
CFD - STAR-CD, STAR-CCM+; CSM – Abaqus
HPC Solution
Linux cluster (~256 cores); PanFS 30TB file system
Business Value
File reads and merge operations 2x faster than NAS
Parallel I/O in STAR-CD 3.26 can leverage PanFS parallel file system today
Parallel I/O under development for v4.06 (Q2 08)
Plans for parallel I/O for STAR-CCM+ “sim” file (Q4 08)
Panasas: 3 Shelves, 30 TB
Linux x86_64, 256 cores
New CD-adapco Cluster and Panasas Storage
Honeywell Aerospace and Panasas Storage
Honeywell Aerospace, Turbomachinery, Locations in US
Profile
Use of HPC for design of small gas turbine engines and engine components for GE, Rolls Royce, and others
Challenge
Deploy CAE simulation software for improvements in aerodynamic efficiency, noise reductions, combustion, etc.
Provide HPC cluster environment to support distributed users for CFD – FLUENT, CFX; CSM – ANSYS, LS-DYNA
HPC Solution
Linux clusters (~452 cores total); Panasas on the latest 256 cores
Panasas parallel file system, 5 storage systems, 50 TBs
Business Value
CAE scalability with Panasas allows improved LES simulation turn-around for combustors
Enables efficiency improvements, reduction of tests
Panasas: 5 Shelves, 50 TB
Linux x86_64, 256 cores
Boeing Company, CAG & IDS, Locations in USA
Profile
Use of HPC for design of commercial aircraft, and of space, communication, and defense weapons systems
Challenge
Deploy CAE simulation software for improvements in aerodynamic performance, reductions in noise, etc.
Provide HPC cluster environment to support 1000’s of users for CFD (Overflow; CFD++), CSM (MSC.Nastran; Abaqus; LS-DYNA), and CEM (CARLOS)
HPC Solution
8 x Linux clusters (~3600 cores); 2 x Cray X1 (512 cores)
Panasas PanFS, 112 storage systems, >900 TB
Business Value
CAE scalability allows rapid simulation turn-around, and enables Boeing to use HPC for reduction of expensive tests
Panasas 116 Shelves, 900 TB
8 x Linux x86_64; 2 x Cray X1
Boeing HPC Based on Panasas Storage
Boeing HPC Awards at SC06 and SC07
Existing Technology: Aerodynamics; Structures; Propulsion; Electromagnetics; Acoustics
Newer Technology: Transient CFD; Aeroelasticity; Larger-scale acoustics
Announced 12 Nov 07: 2007 Reader’s Choice Recipient
CAE Workflow Bottlenecks: I/O related to end-user collaboration-intensive tasks:
• Long times to move model domain partitions to compute nodes
• Post-processing of large files slowed by their network transfer
• Case and data management (movement) of CAE simulation results
CAE Workload Bottlenecks : I/O related to parallel cluster compute-intensive tasks:
• Throughput of “mixed disciplines” competing for the same I/O resource
• Transient CFD (LES, etc.) with increased data-save frequency
• Large-DOF CSM implicit with out-of-core I/O requirements
• Multi-million-element CSM explicit with 1000s of data saves
• Non-deterministic modeling automation and parameterization
• General application of multi-scale, multi-discipline, multi-physics
CAE Productivity Challenges are Growing
[Schematic: steady-state CFD run — input (mesh, conditions) read at start; solver iterates to completion at ~3,000 iterations; results (pressures, ...) written once at the end]
Computational Schematic of a CFD Simulation
Steady State CFD: I/O is Manageable
[Schematic: same CFD run, steady state vs. unsteady — input (mesh, conditions) read at start; the steady-state case completes at ~2,500 iterations and writes results (pressures, ...); the unsteady case continues to ~10,000 iterations]
Computational Schematic of a CFD Simulation
Unsteady: More Computation Steps . . .
[Schematic: unsteady CFD run — input (mesh, conditions) read at start; the steady-state portion completes at ~2,500 iterations with results (pressures, ...); the unsteady portion continues to ~10,000 iterations, writing a time-history data save every few time steps (time step 5, 10, 15, 20, 25, ... 500)]
Computational Schematic of a CFD Simulation
. . . But 100x I/O and Case for Parallel I/O
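A back-of-the-envelope sketch of where the ~100x I/O comes from: a steady-state run writes one result set at convergence, while the unsteady run writes a time-history save at regular intervals across its ~10,000 iterations. The file size and save interval below are illustrative assumptions, not figures from the presentation.

/* Illustrative only: rough I/O volume for steady vs. unsteady CFD runs.
   Result size and save interval are assumptions for the sketch. */
#include <stdio.h>

int main(void)
{
    double result_gb      = 10.0;   /* assumed size of one full result save (GB) */
    int    unsteady_iters = 10000;  /* iterations in the unsteady run (per the schematic) */
    int    save_interval  = 100;    /* assumed data-save frequency (iterations) */

    double steady_io   = result_gb;                                    /* one save at convergence */
    double unsteady_io = result_gb * (unsteady_iters / save_interval); /* periodic time-history saves */

    printf("steady-state I/O : %.0f GB\n", steady_io);
    printf("unsteady I/O     : %.0f GB (%.0fx)\n",
           unsteady_io, unsteady_io / steady_io);
    return 0;
}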
More Parallel CFD Means I/O Must Scale
1998: Desktops – single thread, compute-bound
2003: SMP Servers – 16-way, I/O significant
2008: HPC Clusters – 64-way, I/O-bound and a growing bottleneck
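The shift is essentially Amdahl’s law applied to I/O: the compute phase spreads across more cores while reads and writes remain serial, so I/O takes a growing share of the wall clock. A minimal sketch with assumed (illustrative) compute and I/O times:

/* Illustrative only: compute time scales across cores, serial I/O does not,
   so the I/O share of wall time grows with core count. Numbers are
   assumptions for the sketch, not benchmark data. */
#include <stdio.h>

int main(void)
{
    double compute_s = 3600.0;   /* assumed serial compute time (s) */
    double io_s      = 120.0;    /* assumed serial I/O time (s) */
    int    cores[]   = {1, 16, 64};

    for (int i = 0; i < 3; i++) {
        double wall = compute_s / cores[i] + io_s;   /* compute scales, I/O does not */
        printf("%3d cores: wall = %6.0f s, I/O share = %4.1f%%\n",
               cores[i], wall, 100.0 * io_s / wall);
    }
    return 0;
}

With these assumed numbers the I/O share grows from about 3% of the run on a single core to roughly two-thirds at 64-way, which is the progression the slide describes.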
ISV: Software               Discipline                 2008      2009      2010       Parallel I/O Status
ANSYS: FLUENT               CFD                        24 – 48   32 – 64   64 – 96    Y – v12 in Q3 08
CD-adapco: STAR-CD          CFD                        24 – 48   32 – 64   64 – 96    Y – v3.26, v4.2 in Q2 08
CD-adapco: STAR-CCM+        CFD                        32 – 64   48 – 64   64 – 96    Y – v2.x in Q4 08
Metacomp: CFD++             CFD                        32 – 64   48 – 96   64 – 128   Y – v6.x in Q1 08
Acusim: AcuSolve            CFD                        32 – 64   48 – 96   64 – 128   Y – v5.x, need to test
ANSYS: CFX                  CFD                        24 – 48   32 – 56   48 – 64    N – plans to follow FLUENT
Exa: PowerFLOW              CFD                        32 – 64   48 – 72   64 – 96    N – no plans announced
LSTC: LS-DYNA               CSM Explicit – Impact      32 – 64   48 – 64   64 – 96    N – no plans announced
ABAQUS: ABAQUS/Explicit     CSM Explicit – Impact      08 – 16   12 – 24   16 – 32    N – no plans announced
ESI: PAM-CRASH              CSM Explicit – Impact      32 – 64   48 – 56   48 – 72    N – no plans announced
Altair: RADIOSS             CSM Explicit – Impact      24 – 32   32 – 48   42 – 64    N – no plans announced
ANSYS: ANSYS                CSM Implicit – Structures  04 – 06   04 – 08   06 – 12    Y & N – scratch, not results
MSC.Software: MD Nastran    CSM Implicit – Structures  04 – 06   04 – 06   06 – 10    Y & N – scratch, not results
ABAQUS: ABAQUS/Standard     CSM Implicit – Structures  04 – 06   06 – 12   08 – 24    Y & N – scratch, not results
(2008/2009/2010 figures are scalability in practice, in cores)
CAE Solver and I/O Scalability Status
ISV: Software – Panasas Progress
ANSYS: FLUENT – PanFS certified for v12 early 08; Panasas system installed
CD-adapco: STAR-CD – PanFS for v3.26 today, STAR-CCM+ in 08; 3 systems installed
LSTC: LS-DYNA – Automotive benchmarks completed for explicit, benefits in d3plot save; need aero benchmarks and more implicit testing; Panasas system installed, Panasas engineering giving guidance
ABAQUS: ABAQUS – Key web benchmarks completed for explicit, benefit in 20% range shown, working now on implicit; Panasas system installed, Panasas engineering giving guidance
ANSYS: ANSYS – Panasas system installed during Q3 07, testing has begun
MSC.Software: MD Nastran – Initial discussions completed, progress made during Q3 07
Metacomp: CFD++ – System installed, parallel I/O project begins during early 08
ANSYS: ANSYS CFX – Joint review Q2 07, leverage FLUENT project, system installed
ESI: PAM-CRASH – Initial discussions began 25 Jan 07, review in progress
Exa: PowerFLOW – Exa confirmed I/O bottlenecks a customer issue, no plans yet
AcuSim: AcuSolve – ISV believes PanFS leverage today, must test for parallel I/O
Altair: OptiStruct – Working with ISV technical team to implement read-backward
CEI: Ensight – Confirmed on alliance, testing to begin during Q3 07
IL: FIELDVIEW – Customer-driven PanFS support for distributed post-processing
Panasas Investments in CAE Alliances
Progress as of 01 Apr 08
Parallel: truly parallel and improved serial I/O; improved scalability; partitioning of 1-billion-cell cases
750 million cell FLUENT 12 case (80 GB pdat file); Intel IB cluster, 512 cores; Panasas file system
         Serial I/O    Parallel I/O    Speedup
Read     1708 s        157 s           ~11x
Write    4255 s        335 s           ~13x
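The speedup figures follow directly from the timings: 1708 s / 157 s ≈ 11 for reads and 4255 s / 335 s ≈ 13 for writes.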
ANSYS CFD 12.0: Core Area Advances (I)
Source: Dr. Dipankar Choudhury, Technical Keynote of the European Automotive CFD Conference, 05 July 2007, Frankfurt, Germany
FLUENT 12 results for the 750M cell model produced by ANSYS on a cluster at the Intel Data Center
Standards-based Core Technologies with HPC Productivity Focus
Scalable I/O and storage solutions for HPC computation and collaboration
Investments in ISV Alliances and HPC Applications Development
Joint development on performance and improved application capabilities
Established and Growing Industry Influence and Advancement
Valued contributions to customers, industry, and research organizations
HPC Technology | ISV Alliances | Industry Advancement
Panasas HPC Focus and Vision
www.panasas.com
Thank you for this opportunity
Q & A
For more information, call Panasas at:
1-888-PANASAS (US & Canada)
00 (800) PANASAS2 (UK & France)
00 (800) 787-702 (Italy)
+001 (510) 608-7790 (All Other Countries)
Stan Posey, Director, Vertical Marketing and Business Development, [email protected]