


HPC and Interval Computations

Nathalie RevolINRIA - University of Lyon

SCAN 2016Uppsala, Sweden

September 26-29, 2016


Some issues in HPC (cf. Dongarra and Gustafson, SCAN 2014)

One big issue in developing HPC applications: moving data takes tremendously longer than computing with them.

A recently revised version of the figures can be found, e.g., in the slides of Dongarra (HPC Days, Lyon, April 2016).


Critical Issues at Peta & Exascale for Algorithm and Software Design

• Synchronization-reducing algorithms: break the fork-join model.

• Communication-reducing algorithms: use methods which attain lower bounds on communication.

• Mixed-precision methods: 2x speed of ops and 2x speed for data movement.

• Autotuning: today's machines are too complicated; build "smarts" into software to adapt to the hardware.

• Fault-resilient algorithms: implement algorithms that can recover from failures/bit flips.

• Reproducibility of results: today we can't guarantee this. We understand the issues, but some of our "colleagues" have a hard time with this.


What to take home?

About reproducibility: is there a real need for reproducible results, or rather for bounds on the results? If r1 ∈ [r1] and r2 ∈ [r2] and [r1] ∩ [r2] ≠ ∅ (where [r1] and [r2] are the computed enclosures), then are r1 and r2 equally acceptable?

There is time to do more computations on the data, but not too much time: no more than 5 to 10 times (never more than 20) longer than the original computation.
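The intersection test above can be sketched in a few lines. This is a minimal illustration, not from the talk; the helper `intersects` and the intervals-as-(inf, sup)-pairs convention are mine.

```python
# Hypothetical sketch: instead of demanding bitwise-identical results
# r1 == r2 across runs, accept two runs when their computed enclosures
# [r1] and [r2] have a non-empty intersection.

def intersects(lo1, hi1, lo2, hi2):
    """Non-empty intersection test for two closed intervals."""
    return max(lo1, lo2) <= min(hi1, hi2)

# Two runs of the "same" computation on different machines:
run_a = (0.9999, 1.0002)   # enclosure from machine A
run_b = (1.0000, 1.0003)   # enclosure from machine B
both_acceptable = intersects(*run_a, *run_b)  # non-empty overlap: compatible
```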


Agenda

Intermezzo 1: Some Issues in HPC

Automatic approaches

Intermezzo 2: Some Issues in HPC

Resilience
    Introduction
    Matrix product
    Conjugate Gradient

Conclusion


In Kahan's opinion: How Futile are Mindless Assessments of Roundoff in Floating-Point Computation?, 2006

Several schemes have been advocated as substitutes for or aids to error analysis by non-experts. None can be trusted fully if used as advertised, which is usually Mindless, i.e. without a penetrating analysis of the program in question. [...] The several Mindless schemes in question include Interval Arithmetic, and recomputation with increasing precision, or with redirected rounding, or with randomized rounding, or with randomly perturbed input data.


In Kahan's opinion: How Futile are Mindless Assessments of Roundoff. . .

Can the effects of roundoff upon a floating-point computation be assessed without submitting it to a mathematically rigorous and (if feasible at all) time-consuming error-analysis? In general, No. Interval Arithmetic approximates every variable by an interval whose ends straddle the variable's true value. Used naively, this scheme is cursed by excessively wide intervals that undermine its credibility when wide intervals are deserved. Swollen intervals can often be curbed by combining Interval Arithmetic with ordinarily rounded arithmetic in a computation artfully recast as the determination of the fixed-point of a sufficiently contractive mapping. Artful is far from Mindless. Far less art may coax success from extendable-precision Interval Arithmetic, though its price may be high and its performance slow.


Naïve use of interval arithmetic

With usual precision (the floating-point arithmetic available on the processor): cf. Kahan's warning: it may work. . . or it may not.

With arbitrary precision (cf. the MPFI library): it will eventually work. . . but not within an overhead factor of 5 to 10.


Other representations of intervals

Use of the mid-rad representation: better suited for this purpose, as the midpoint corresponds to the floating-point value and the radius accounts for roundoff errors.

With usual precision (the floating-point arithmetic available on the processor), cf. the IntLab library: efficient, often does the job.

With arbitrary precision (cf. the LILIB library): arbitrary precision for the midpoint and much less precision for the radius. Talk "LILIB (Long Interval Library)" by Nozomu Matsuda and Nobito Yamamoto (session G1, Thursday 10:55).
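A toy mid-rad arithmetic makes the representation concrete. This is a sketch only: the names `mr_add` and `mr_mul` are mine, and real libraries such as IntLab additionally inflate the radius to cover the roundoff committed in computing mid and rad themselves, which this toy does not do.

```python
# Minimal mid-rad sketch (illustrative only): an interval is a (mid, rad) pair.

def mr_add(a, b):
    (ma, ra), (mb, rb) = a, b
    return (ma + mb, ra + rb)

def mr_mul(a, b):
    (ma, ra), (mb, rb) = a, b
    # |x*y - ma*mb| <= |ma|*rb + |mb|*ra + ra*rb for x in a, y in b
    return (ma * mb, abs(ma) * rb + abs(mb) * ra + ra * rb)

x = (2.0, 0.1)    # the interval [1.9, 2.1]
y = (3.0, 0.2)    # the interval [2.8, 3.2]
s = mr_add(x, y)  # midpoint 5.0, radius 0.3
p = mr_mul(x, y)  # midpoint 6.0, radius 0.72
```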


Other representations of intervals

Regarding the mid-rad representation:

• matrix product within a factor of 3 compared to MKL, on a multicore (PhD thesis of Philippe Theveny, 2014);

• linear system solving within a factor of 15 compared to MatLab (PhD thesis of Hong Diep Nguyen, 2011).


Other representations of intervals

Affine arithmetic (cf. Masahide Kashiwagi's talk on Monday afternoon).

Polynomial models (cf. Hiroshi Kokubu's talk on Monday morning).

But

• computed results may not correspond to roundoff errors,

• execution time may not remain within an overhead factor of 5 to 10.



Some issues in HPC (cf. Dongarra, HPC Days, Lyon, April 2016)

Another big issue in developing HPC applications: failures. On Sequoia Blue Gene/Q (number 1 of the TOP500 in June 2012), the node failure rate is 1.25 failures per day.

Resilience is needed. The approaches mentioned do not suffice: a missing result can go unnoticed.

How can resilience be obtained? Two classical approaches: backward or forward recovery (cf. Herault, Robert and Vivien, June 2016, 5th JLESC (Joint Laboratory on Extreme Scale Computing) workshop).


Backward Recovery vs. Forward Recovery

Backward Recovery

Rollback / Backward Recovery: goes back in history to recover from failures.

Spends time to re-execute computations

Rebuilds states already reached

Typical: checkpointing techniques

Thomas Herault, Yves Robert, and Frederic Vivien, ABFT


Forward Recovery

Forward Recovery: proceeds without returning

Pays additional costs during (failure-free) computation to maintain consistent redundancy

Or pays additional computations when failures happen

General technique: Replication

Application-specific techniques: iterative algorithms with fixed-point convergence, ABFT, ...


ABFT?

ABFT stands for Algorithm-Based Fault Tolerance (Huang and Abraham, 1984).

The previous slides and the explanation below come from a course by Herault, Robert and Vivien, June 2016 (5th JLESC (Joint Laboratory on Extreme Scale Computing) workshop).


ABFT and the checksum technique

If A is a matrix, let us consider

    A' = ( A       A·e    )
         ( eᵀ·A    eᵀ·A·e )

(where e is the vector of all ones), e.g.

    A = ( 1  2  3 )        A' = (  1   2   3    6 )
        ( 4  5  6 )             (  4   5   6   15 )
        ( 9  8  7 )             (  9   8   7   24 )
                                ( 14  15  16   45 )

The added row contains the sum of the elements of “its” column,the added column contains the sum of the elements of “its” row.

If an element is missing/lost, it is possible to detect it (check the checksums) and to recover it: it is the difference between the stored checksum and the sum of the remaining elements.
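The construction and recovery above can be sketched directly. The helper name `augment` is mine; exact integer arithmetic is used here, so the checksums match exactly.

```python
# Sketch of the checksum construction: augment A with a row of column sums
# and a column of row sums, then recover one lost entry from its column.

def augment(A):
    n = len(A)
    Ap = [row[:] + [sum(row)] for row in A]   # append row sums (A·e)
    # append column sums (eT·A and eT·A·e):
    Ap.append([sum(Ap[i][j] for i in range(n)) for j in range(n + 1)])
    return Ap

A = [[1, 2, 3], [4, 5, 6], [9, 8, 7]]
Ap = augment(A)
# Ap == [[1,2,3,6],[4,5,6,15],[9,8,7,24],[14,15,16,45]]

# Lose Ap[1][2] (the 6) and recover it from its column checksum:
lost_i, lost_j = 1, 2
value = Ap[3][lost_j] - sum(Ap[i][lost_j] for i in range(3) if i != lost_i)
# value == 6
```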


ABFT and checksum technique

What happens with floating-point arithmetic? The checksums are equal to the sums of the corresponding elements, up to roundoff errors.

Shall we compute them using interval arithmetic? Using the mid-rad representation, with mid and rad in single precision?


ABFT: matrix multiplication

If A and B are two matrices and C = A·B, let us consider

    A' = (  A   )        B' = ( B   B·e )
         ( eᵀ·A )

and

    C' = A'·B' = ( A·B       A·B·e    )
                 ( eᵀ·A·B    eᵀ·A·B·e )
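The identity C' = A'·B' can be checked numerically on a tiny example. The list-of-lists representation and the helper `matmul` are mine, for illustration only.

```python
# Sketch: multiplying the row-augmented A' by the column-augmented B'
# directly yields the checksummed product C' = A'·B'.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

Ar = A + [[sum(col) for col in zip(*A)]]   # A': extra checksum row (eT·A)
Bc = [row + [sum(row)] for row in B]       # B': extra checksum column (B·e)
Cp = matmul(Ar, Bc)                        # C' = A'·B'

C = matmul(A, B)
# The top-left block of C' is C; its last row/column are the checksums of C.
```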


What happens with floating-point arithmetic? The checksums are equal to the sums of the corresponding elements, up to roundoff errors. Shall we compute them using interval arithmetic? Using the mid-rad representation, with mid and rad in single precision?

Let the matrix multiplication be performed in floating-point arithmetic. A bound on the roundoff error is known:

    |A·B − fl(A·B)| ≤ n·u·|A|·|B|        (Jeannerod & Rump, 2013)

(where u is the unit roundoff).

Re-compute the last row/column and check it against the computed one: they match if they are equal up to this bound.
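A minimal sketch of this tolerance test, under the assumptions that u = 2⁻⁵³ (binary64) and that the caller supplies the relevant entry of |A|·|B| as `magnitude`; the helper name `checksum_ok` is mine.

```python
# Sketch of the floating-point acceptance test: a checksum is accepted when
# it agrees with the recomputed sum up to the a priori bound n*u*|A|*|B|
# (Jeannerod & Rump, 2013).

u = 2.0 ** -53  # unit roundoff of IEEE binary64

def checksum_ok(computed, recomputed, n, magnitude):
    """Accept if |computed - recomputed| <= n * u * magnitude."""
    return abs(computed - recomputed) <= n * u * magnitude

# A roundoff-sized discrepancy passes; a real error does not:
ok = checksum_ok(0.1 + 0.2, 0.3, n=2, magnitude=0.3)   # roundoff-sized gap
bad = checksum_ok(1.0, 2.0, n=2, magnitude=1.0)        # a genuine mismatch
```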


Reminder: Conjugate Gradient (1/3)
Hestenes and Stiefel, 1952, based on Lanczos' approach

Let A be a symmetric positive definite matrix and b a vector.

Goal: find x such that Ax = b.

Equivalent problem:

    min J(x) = (1/2)·xᵀ·A·x − xᵀ·b.


Reminder: Conjugate Gradient (2/3)

Let x_0 be an initial vector.
Gradient of J: A·x − b = −r, where r is the residual.

Gradient method: go in the direction of the gradient:

    x_{i+1} = x_i + λ_i·r_i, where r_i = b − A·x_i.

Even better, the conjugate gradient method: go in a direction p related to the gradient, and conjugate to all preceding directions:

    x_{i+1} = x_i + α_i·p_i, where p_i = r_i − Σ_{j=1}^{i−1} β_j·p_j,

such that p_iᵀ·A·p_j = 0 for i ≠ j.
This method converges in n steps if n is the dimension of A.
In practice: iterative method, one stops (well) before reaching n.


Reminder: Conjugate Gradient (3/3)

Algorithm (CG)
Input: A ∈ R^{n×n}, b ∈ R^n, x_0 ∈ R^n
  r_0 = b − A·x_0, p_0 = r_0
  for (i = 0, 1, ...)
    s_i = A·p_i
    α_i = r_iᵀ·r_i / s_iᵀ·p_i
    x_{i+1} = x_i + α_i·p_i
    r_{i+1} = b − A·x_{i+1}
    β_{i+1} = r_{i+1}ᵀ·r_{i+1} / r_iᵀ·r_i
    p_{i+1} = r_{i+1} + β_{i+1}·p_i
Output: x_{i+1}
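The algorithm above transcribes almost line by line. A minimal sketch in plain Python on a tiny symmetric positive definite system; the helper names `cg`, `mv` and `dot` are mine, not from the talk.

```python
# Plain CG with the explicit residual r_{i+1} = b - A·x_{i+1}, as on the slide.

def cg(A, b, x0, iters):
    n = len(b)
    mv = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = x0[:]
    r = [bi - axi for bi, axi in zip(b, mv(A, x))]
    p = r[:]
    for _ in range(iters):
        s = mv(A, p)                                    # s_i = A·p_i
        alpha = dot(r, r) / dot(s, p)                   # step length
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [bi - axi for bi, axi in zip(b, mv(A, x))]  # explicit residual
        beta = dot(r_new, r_new) / dot(r, r)
        p = [ri + beta * pi for ri, pi in zip(r_new, p)]
        r = r_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg(A, b, [0.0, 0.0], iters=2)   # converges in n = 2 steps
```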


Reminder: Conjugate Gradient (3/3)

Algorithm (CG)
Input: A ∈ R^{n×n}, b ∈ R^n, x_0 ∈ R^n
  r_0 = b − A·x_0, p_0 = r_0
  for (i = 0, 1, ...)
    s_i = A·p_i
    checksum for s_i = A·p_i: re-compute s_i if needed
    α_i = r_iᵀ·r_i / s_iᵀ·p_i
    x_{i+1} = x_i + α_i·p_i
    r_{i+1} = r_i − α_i·s_i
    if (error on r_{i+1} too large)
      r_{i+1} = b − A·x_{i+1}
    β_{i+1} = r_{i+1}ᵀ·r_{i+1} / r_iᵀ·r_i
    p_{i+1} = r_{i+1} + β_{i+1}·p_i
Output: x_{i+1}
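The "error on r_{i+1} too large" step can be sketched as a check of the cheap recurrence residual against the true one. This is a hypothetical helper (`residual_check` is my name, not from the talk), showing only the detect-and-recompute idea.

```python
# Sketch: the recurrence residual r_{i+1} = r_i - alpha_i*s_i can drift away
# from, or be corrupted relative to, the true residual b - A·x_{i+1}. When
# the gap exceeds a tolerance, fall back to the explicit recomputation.

def residual_check(r_rec, A, x, b, tol):
    n = len(b)
    r_true = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    gap = max(abs(a - c) for a, c in zip(r_rec, r_true))
    return r_rec if gap <= tol else r_true   # keep cheap residual if it agrees

# A soft error corrupts one component of the recurrence residual:
A = [[2.0, 0.0], [0.0, 2.0]]
x = [1.0, 1.0]
b = [3.0, 3.0]                       # true residual is [1.0, 1.0]
corrupted = [1.0, 100.0]
repaired = residual_check(corrupted, A, x, b, tol=1e-8)
```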


ABFT and Conjugate Gradient

Initial goal: resilient CG.

Result: resilient and more accurate CG.

(Work by E.F. Yetkin, E. Agullo, S. Cools, L. Giraud, W. Vanroose, "Soft Errors in CG: Detection and Correction", SIAM Parallel Processing, 2016.)

Could the error on the residual computed by the recurrence be bounded by interval computations?


Conclusion: some thoughts. . .

On why and how interval computations could benefit HPC:

• comparison of different results amounts to a non-empty intersection,
• detection of failures as well,
• if fast interval computations are developed.

On the complex relationship between HPC and IA:

• HPC is not welcoming to interval computations (rounding modes are not respected),
• but interval computations would be well-suited for HPC (high numerical intensity);
• interval arithmetic may be useful to HPC,
• but it is not yet clear how to implement it in an optimized way.

This conclusion contains more future work than conclusions!