
Dependency Injection for

Programming by Optimization

Zoltan A. Kocsis1 and Jerry Swan2

1. School of Mathematics, University of Manchester, Oxford Road, Manchester M13 9PL, UK. [email protected]
2. Computer Science, University of York, Deramore Lane, York, YO10 5GH, UK.

July 14, 2017

Abstract

Programming by Optimization tools perform automatic software configuration according to the specification supplied by a software developer. Developers specify design spaces for program components, and the onerous task of determining which configuration best suits a given use case is delegated to automated analysis tools and optimization heuristics. However, in current approaches to Programming by Optimization, design space specification and exploration relies on external configuration algorithms, executable wrappers and fragile, preprocessed programming language extensions.

Here we show that the architectural pattern of Dependency Injection provides a superior alternative to the traditional Programming by Optimization pipeline. We demonstrate that configuration tools based on Dependency Injection fit naturally into the software development process, while requiring less overhead than current wrapper-based mechanisms. Furthermore, the structural correspondence between Dependency Injection and context-free grammars yields a new class of evolutionary metaheuristics for automated algorithm configuration. We found that the new heuristics significantly outperform existing configuration algorithms on many problems of interest (in one case by two orders of magnitude). We anticipate that these developments will make Programming by Optimization immediately applicable to a large number of enterprise software projects.

1 Introduction

Proper configuration of software is a particularly challenging issue in both research and industry. Interactions between design decisions have effects on performance and functionality that are difficult to predict. The observation that automated algorithm configuration and parameter tuning tools can simplify this task has led to a new software development paradigm: Programming by Optimization (PbO) [10]. Development in the PbO paradigm consists of specifying large design spaces of program component implementations: the onerous task of determining which components work best in a given use case is achieved via automated analysis tools and optimization heuristics.

The standard Programming by Optimization tools operate on design spaces specified in a specialized extension of a target programming language, transformed into the target language by a specialized weaver tool. The optimization choices over the combined design space are made by a separate algorithm configuration tool, which has historically been applied to the resulting executable program.

arXiv:1707.04016v1 [cs.AI] 13 Jul 2017

The weaver-based architecture severely limits the applicability of Programming by Optimization. The reliance on markup extensions hinders the adoption of PbO for existing code bases. In addition, the external configuration tools have to operate on the executable via a brittle textual interface (command-line arguments), which introduces significant overhead and makes on-line optimization difficult. Despite these shortcomings, no alternative to weaver tools has been introduced since the initial PbO proposal.

We developed ContainAnt, a software library for Programming by Optimization that addresses these limitations by replacing syntactic extensions and weavers with the Dependency Injection architectural pattern [18]. By exploiting a structural correspondence between Dependency Injection and context-free grammars, we obtain a new class of grammar-based evolutionary heuristics suitable for automated algorithm configuration. We determined that these new heuristics significantly outperform existing configuration algorithms on several common configuration tasks and optimization problems, both in terms of solution quality and execution speed (in one case reducing the optimization time from four hours to 46 seconds).

This paper discusses the theory and implementation of the ContainAnt library and its grammar-based heuristics. Sections 1.1 and 1.2 introduce Dependency Injection and review the existing work on Programming by Optimization. Section 2 describes the theoretical correspondence between Dependency Injection and optimization problems over context-free grammars, while Section 3 gives novel heuristics for solving the resulting grammatical optimization problems using genetic algorithms and ant colony techniques. The remainder of the paper analyzes five different experiments used to evaluate the performance of the ContainAnt heuristics.

1.1 Dependency Injection

Object-oriented software provides functionality via multiple interdependent components. Software engineering efforts to handle the problems of dependency instantiation and reference acquisition between these components have led to the widespread adoption of a new type of middleware library, the so-called Dependency Injection (DI) container [18]. The term "Dependency Injection" was coined by Fowler [7] in 2004, and DI containers have seen increasingly widespread use over the last decade, with popular frameworks including the Java Spring framework¹ and Google Guice².

Software written using DI inherently exposes highly structured configuration parameters: components are configured by searching over the space of dependencies, without modifying the source code of the components themselves. The traditional operation of a DI container is to perform the wiring between the constructors of dependent objects (also known as the 'object graph') by consulting a configuration object or file that contains a list of bindings between abstract types and their constructor arguments. The container then selects a target class and greedily supplies the dependencies to a suitable constructor of the target class. At this point, it is worth noting a significant limitation of some popular DI containers (e.g. Guice): configuration is not possible if the object graph contains ambiguities such as a choice of multiple subtypes of an abstract class. As described in detail in Section 3, the optimization-based approach of ContainAnt removes this limitation.
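The greedy constructor wiring described above is easy to sketch. The following Python fragment is an illustrative analog of a DI container, not ContainAnt's actual API; the Car and Engine classes and the method names are hypothetical. It reads each constructor's parameter annotations and recursively instantiates the dependencies.

```python
import inspect

class Engine:
    """A leaf dependency with a no-argument constructor."""
    def __init__(self):
        self.power = 90

class Car:
    """Declares its dependency on Engine via a constructor annotation."""
    def __init__(self, engine: Engine):
        self.engine = engine

class Container:
    def __init__(self):
        self.bindings = {}  # abstract type -> concrete class

    def bind(self, abstract, concrete):
        self.bindings[abstract] = concrete

    def create(self, cls):
        # Resolve a binding if one exists, otherwise use the class itself.
        concrete = self.bindings.get(cls, cls)
        args = []
        for name, param in inspect.signature(concrete.__init__).parameters.items():
            if name != "self":
                # Recursively instantiate each annotated dependency.
                args.append(self.create(param.annotation))
        return concrete(*args)

car = Container().create(Car)
print(type(car.engine).__name__)  # -> Engine
```

Note that, like the greedy containers discussed above, this sketch has no way to choose between multiple bound subtypes; resolving that ambiguity is exactly where the optimization of Section 3 enters.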

1.2 Related Work

In their seminal work on Programming by Optimization, Hoos et al. [10] delineated five levels of PbO, ranging in sophistication from tuning the exposed parameters of an application (Level 0) to the use of evidence-based methods for exploring large design spaces as the driving activity for the software design process (Level 4).

To realize the higher levels of PbO, they introduced the concept of a PbO-enhanced language: a superset of an existing programming language (e.g. PbO-Java or PbO-C) which includes constructs for declaring the possible design choices for parameters and blocks of code. The code written in this markup language is translated into the target language via a syntactic transformation performed by a specialized PbO weaver tool, reminiscent of a macro preprocessor.

1 http://projects.spring.io/spring-framework
2 https://github.com/google/guice

The optimized choices (as determined over the combined design spaces on a set of training cases) are made by an external automatic configuration tool. Configuration optimizers have been proposed that use various heuristics, e.g. iterated local search [12], genetic algorithms [1] and iterated racing [14]. A notable achievement of PbO is the development and use of the SMAC configuration tool [11] to improve upon the state of the art in SAT solving by tuning parameters of the Spear SAT solver. While the majority of configuration optimizers are model-free, SMAC alternates between building a regression model to predict configuration performance and gathering additional performance data based on this model. The regression model is obtained via random forests, a method which is known to perform well on categorical variables and also allows quantification of uncertainty.

Search Based Software Engineering (SBSE) [9] is the application of heuristic search to various aspects of the software development process, with a strong historical emphasis on software testing. Much recent interest within SBSE has focused on 'embedded adaptivity' [8], i.e. allowing software developers to delegate the configuration/generation of specified aspects of program functionality to heuristic search procedures [4]. Such SBSE activity is therefore strongly aligned with the previously stated goals of PbO, but often with emphasis on a generation process which can respond dynamically to changes in the operating environment of the program. Previous work in this area includes Gen-O-Fix [26] and ECSELR [29], both of which are embedded monitor systems that support search via Evolutionary Computation. Templar and Polytope are two alternative approaches to software component generation: Templar [25] provides a 'top-down' framework for orchestrating one or more 'variation points' generated by Genetic Programming, while Polytope [27] uses methods from datatype-generic programming to support the 'bottom-up' generation of individual variation points in source code.

Since Dependency Injection containers already automate a non-trivial part of the Software Engineering process, they provide a natural entry point for the application of heuristic methods from SBSE.

2 Grammatical Optimization

Backus-Naur Form (BNF) is a widely adopted syntax for describing context-free languages. For the sake of technical convenience (the ability to have different rewrite rules with identical bodies), we present a slight variation of the usual notion, the labeled BNF formalism introduced by Forsberg and Ranta [6]. Formally, such a grammar G consists of the following components:

• A set of terminal symbols GT. These are the literals or words that make up the language.

• A pointed set of non-terminal symbols GN, with a distinguished start symbol s ∈ GN. These categorize the sub-expressions of the language.

• A set of rewrite rules GR. Normally, each rewrite rule has the form (a, b) where a ∈ GN and b is a sequence of symbols from GT ∪ GN. Since we are dealing with labeled BNF, rewrite rules have the form (ℓ, a, b) where ℓ is a unique label, the left-hand side a is a non-terminal and the right-hand side b is a finite sequence of symbols from GT ∪ GN.

At this point, it is customary to introduce the notion of sentence: a sequence of terminal symbols obtained from the start symbol s by applying a sequence of rewrite rules, i.e. by replacing non-terminals with the right-hand sides of the corresponding rewrite rules. Such a sequence of rules can be represented as a rooted tree known as a derivation tree. A grammar is unambiguous if each of its sentences has a unique corresponding derivation tree.

In practice, the actual sentences of the language turn out to be immaterial from the perspective of a grammatical optimization problem, so it is simpler to work from a direct definition of derivation trees. Thus we ignore the underlying sentences altogether and inductively define a derivation tree of sort x ∈ GN to consist of the following data:

• A rewrite rule of the form (ℓ, x, b),

• A derivation tree of sort a ∈ GN for every non-terminal symbol a in the sequence b.

From here on, all derivation trees are assumed to have sort s (the start symbol of the grammar G). The set of all such derivation trees is denoted D(G).

Grammars can be specified by listing their rewrite rules in the following format:

Label. <LHS> ::= RHS

where angled brackets are used to distinguish between terminals and non-terminals. Two elementary examples follow:

2.0.1 Binary Strings

The grammar of binary strings is given by:

GT = {0, 1, e},
GN = {s},
GR = {(0, s, 0s), (1, s, 1s), (e, s, e)}.

Using the shorthand defined above, the rewrite rules could also be written as

0. <s> ::= 0 <s>

1. <s> ::= 1 <s>

e. <s> ::= e

Derivation trees for this grammar correspond to finite sequences of binary digits (with e being the terminating character). A grammar of strings over any given finite alphabet can be defined analogously.
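As a concrete illustration of these definitions, the following Python sketch represents labeled rules as (label, lhs, rhs) triples and flattens a derivation tree of the binary-string grammar back into its sentence. The representation choices are ours, not prescribed by the formalism.

```python
# Labeled BNF rules (label, lhs, rhs) for the binary-string grammar;
# one-character strings like "0" are terminals, "s" is the only non-terminal.
RULES = [("0", "s", ["0", "s"]),
         ("1", "s", ["1", "s"]),
         ("e", "s", ["e"])]

def sentence(tree):
    """Flatten a derivation tree (rule, subtrees) into its sentence."""
    (label, lhs, rhs), subtrees = tree
    out, sub = [], iter(subtrees)
    for sym in rhs:
        # A non-terminal consumes the next subtree; a terminal is emitted as-is.
        out.append(sentence(next(sub)) if sym == "s" else sym)
    return "".join(out)

# Derivation tree for the sequence 1, 0: rule 1, then rule 0, then the terminator e.
leaf = (RULES[2], [])
tree = (RULES[1], [(RULES[0], [leaf])])
print(sentence(tree))  # -> 10e
```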

2.0.2 Finite Sets

Any finite set S gives rise to a grammar by setting

GT = S,
GN = {s},
GR = {(x, s, x) | x ∈ S}.

The derivation trees of this grammar are in bijective correspondence with elements of the set S.

2.1 Problem Statement

An instance of the grammatical optimization problem is given by the following data:

• A grammar G and

• An objective function f : D(G) → R defined on the derivation trees of the grammar G.

Without loss of generality, we assume that our goal is maximizing the objective function, i.e. solving the grammatical optimization problem consists of finding a globally optimal derivation tree:

x* = arg max_{x ∈ D(G)} f(x).

The definition above is extremely general: indeed, every discrete optimization problem can be reduced to the grammatical optimization problem over the grammar of binary strings.

2.2 Semantics

One can reduce an optimization problem instance with candidate solutions S and objective function f : S → R to an instance of the grammatical optimization problem by giving an encoding grammar G and a surjective function

k : D(G) ↠ S

with surjectivity ensuring that every candidate solution is described by at least one sentence of the language.

To forbid ad-hoc encodings (e.g. the encoding of any discrete optimization problem into the grammar of binary strings discussed above), one should think of the function k as giving a semantics to the sentences of the language defined by the grammar G. From here on, we demand that the semantics be compositional: the meaning of a derivation tree should be given in terms of the meanings of its parts (direct subtrees). The compositionality requirement provides a formal counterpart to the intuitive desideratum that the structure of the grammar be related to the structure of the search space S, without ruling out any interesting grammatical representations.


We will shortly see that both dependency injection and the algorithm configuration problem have sensible, compositional representations as instances of the grammatical optimization problem. What's more, the same holds for many problems of interest in both continuous and combinatorial optimization.

2.3 Rosetta Stone

Analyzing the process of dependency injection leads to a powerful "dictionary" correlating the terminology of grammars with the terminology of object-oriented programming. If the goal is to instantiate an object of some given class C, one first has to find a constructor of C (if the class has no constructors, instantiation is impossible). In turn, the selected constructor will expose zero or more classes as dependencies. If the selected constructor c() has no dependencies, the object can be instantiated directly by calling c(). However, if there are one or more dependencies D1, D2, ..., one has to recursively instantiate objects d1, d2, ... compatible with the given classes before calling c(d1, d2, ...) to instantiate an object of class C.

Now, let G be a grammar. To construct a derivation tree of some given sort s ∈ GN, one starts by choosing a rewrite rule with left-hand side s (no suitable tree can exist in the absence of such a rule). If the right-hand side of the chosen rewrite rule contains no non-terminals, the construction is finished. However, if the right-hand side contains one or more non-terminals n1, n2, ... ∈ GN, one has to recursively construct a derivation tree for each sort ni before constructing the derivation tree for the target sort s.

The structure of the algorithms for dependency injection and derivation tree construction (Algorithms 1 and 2) turns out to be nearly identical. This suggests an analogy between dependency injection and grammatical optimization, with classes corresponding to non-terminals, constructors corresponding to rewrite rules and constants corresponding to terminals. Thus, the grammatical rewrite rule corresponding to the constructor (Java syntax)

T ctor(T1 a1, T2 a2, . . . )

under this assignment is simply

ctor. <T> ::= ctor <T1> <T2> [...]

With this correspondence in mind, we can now recast dependency injection as a grammatical decision/optimization problem.

Given a grammar G, deciding whether D(G) = ∅ amounts to solving a dependency injection problem. The correspondence gives rise to a semantics assigning the constructed object to each derivation tree of the grammar. In the sequel, this is referred to as the usual semantics.
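The correspondence can be made mechanical: reflecting over constructor signatures yields the rewrite rules directly. The Python sketch below is a stand-in for the Scala reflection the library would use; the classes are hypothetical. It reads one rule per constructor, with the parameter types as the non-terminals on the right-hand side.

```python
import inspect

def rules_from(classes):
    """Read each class's constructor as a labeled rewrite rule:
    <Class> ::= ctor <Dep1> <Dep2> ... (illustrative, not ContainAnt's parser)."""
    rules = []
    for cls in classes:
        params = [p.annotation.__name__
                  for n, p in inspect.signature(cls.__init__).parameters.items()
                  if n != "self"]
        # (label, lhs, rhs): the constructor name labels the rule.
        rules.append((cls.__name__, cls.__name__, params))
    return rules

class Engine:
    def __init__(self): pass

class Car:
    def __init__(self, engine: Engine): pass

print(rules_from([Engine, Car]))
# -> [('Engine', 'Engine', []), ('Car', 'Car', ['Engine'])]
```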

Algorithm 1 Class instantiation using Dependency Injection

function instantiate(t: Class)
  for c in t.constructors
    { try to construct each argument recursively }
    for i := 0 to c.arguments.length
      args[i] := instantiate(classOf(c.arguments[i]))
    end for
    { if recursive calls succeed }
    if !args.contains(null)
      return c(args) { call constructor }
    end if
    { else try the next constructor }
  end for
  return null
end function

Algorithm 2 Recursive Derivation Tree Construction

function construct(t: Sort)
  for c in t.rewriteRules
    { try to construct each subtree recursively }
    for i := 0 to c.nonterminals.count
      subtrees[i] := construct(sortOf(c.nonterminals[i]))
    end for
    { if recursive calls succeed }
    if !subtrees.contains(null)
      return Tree(c, subtrees)
    end if
    { else try the next rewrite rule }
  end for
  return null
end function
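The shared shape of the two algorithms can be factored into a single higher-order routine, where "alternatives" means constructors in the DI reading and rewrite rules in the grammar reading. This minimal Python sketch is our own factoring, not code from the library:

```python
def resolve(node, alternatives, children, build):
    """Shared skeleton of Algorithms 1 and 2: try each alternative of `node`,
    recursively resolve its children, return the first complete result."""
    for alt in alternatives(node):
        kids = [resolve(ch, alternatives, children, build) for ch in children(alt)]
        if None not in kids:
            return build(alt, kids)
    return None

# Derivation-tree construction (the Algorithm 2 reading) as an instance.
# Note the greedy skeleton only terminates if a non-recursive rule comes first.
rules = {"s": [("e", []), ("0", ["s"])]}
tree = resolve("s",
               alternatives=lambda sort: rules[sort],
               children=lambda rule: rule[1],
               build=lambda rule, kids: (rule[0], kids))
print(tree)  # -> ('e', [])
```

Instantiating the same skeleton with a class's constructors as `alternatives` and their parameter types as `children` recovers the Algorithm 1 reading.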


3 Heuristics

ContainAnt is, first and foremost, a Dependency Injection library. In order to be as widely applicable as Programming by Optimization, the default heuristics of ContainAnt cannot be problem-specific: they have to operate at the level of problem descriptions. Metaheuristics that only exist as nature-inspired metaphors or informal algorithm templates (i.e. without the ability to automatically transform a problem specification into a working implementation) are insufficient for this. These requirements leave us with a rather small class of suitable metaheuristics, which we now describe.

3.1 Genetic Programming: GrEvo

Incorporating context-free grammars into genetic programming was proposed by Ryan et al. [21]. Their seminal work on Grammatical Evolution allowed the elimination of the closure requirement, a major drawback of untyped Genetic Programming, which required all functions to be able to accept as input the outputs of all other functions. The genotypes are numerical sequences, translated into sentences of a BNF grammar using the mapping of Algorithm 3. Transcribed into the derivation tree formalism of Section 2, the genotypes encode the choice of rewrite rule at each recursive step of the derivation tree construction (Algorithm 2).

Algorithm 3 GrEvo Genotype-Phenotype Mapping

function getPhenotype(g: List[Int], t: Sort)
  { choose rewrite rule based on genotype }
  c := t.rewriteRules[g.head]
  g := g.tail
  for i := 0 to c.nonterminals.count
    subtrees[i] := getPhenotype(g, sortOf(c.nonterminals[i]))
  end for
  return Tree(c, subtrees)
end function
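A runnable version of this mapping, specialized to the binary-string grammar of Section 2, might look as follows. We apply the usual Grammatical Evolution modulo rule to keep codon values in range, a detail elided in Algorithm 3:

```python
RULES = {"s": [("0", ["0", "s"]), ("1", ["1", "s"]), ("e", ["e"])]}

def phenotype(genome, sort, pos=0):
    """Decode a codon list into a derivation tree, consuming one codon per
    expansion (codon taken modulo the rule count, as in standard GE)."""
    label, rhs = RULES[sort][genome[pos] % len(RULES[sort])]
    pos += 1
    subtrees = []
    for sym in rhs:
        if sym in RULES:  # non-terminal: recurse, threading the codon position
            sub, pos = phenotype(genome, sym, pos)
            subtrees.append(sub)
    return (label, subtrees), pos

tree, _ = phenotype([1, 0, 2], "s")
print(tree)  # -> ('1', [('0', [('e', [])])])
```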

Since Grammatical Evolution allows the generation of syntactically correct sentences in an arbitrary language, its implementations are not tied to any specific problem, and are able to operate on any formal grammar specification. Grammatical Evolution remains the most popular metaheuristic of its kind, generally outperforming derivative algorithms such as Grammatical Swarm [17].

The ContainAnt distribution includes an implementation of the Grammatical Evolution metaheuristic with fixed-length genotypes for solving the grammatical optimization problem. This implementation is henceforth called GrEvo. The performance analysis (Section 5.3) shows that some characteristics of GrEvo, such as its premature convergence and poor locality, make it suboptimal for tackling the grammatical optimization problem. This limitation motivates the novel grammar-based heuristic introduced below.

3.2 Ant Programming: GrAnt

Ant colony optimization methods are the main alternative to Genetic Programming for the automated production of computer programs via stochastic search. Ant Programming based on BNF grammars has been investigated by Keber and Schuster [13] under the name Generalized Ant Programming (GAP) in the context of option pricing, and later by Salehi-Abari and White [22] for general automatic programming (EGAP). The development of these heuristics led to what has been called an "uphill battle" between the two methods, while genetic programming was found to be statistically superior to EGAP [23].

Here, we describe a novel ant colony algorithm (GrAnt) for solving the grammatical optimization problem that significantly outperforms Grammatical Evolution on diverse optimization problems. The new heuristic is based on the MAX-MIN Ant System [24], but differs from previous Ant Programming algorithms on two key points:

1. The pheromone levels (associated with rewrite rules) are bounded between a minimum and maximum pheromone value. However, the maximum is treated as a soft bound that can be changed by specific events over the course of the search.


2. Each ant constructs a complete derivation tree in a depth-first, targeted fashion (cf. EGAP's use of partial sentences and non-terminals).

GrAnt (Algorithm 4) maintains a pheromone table, holding a pheromone level lying between a hard minimum level τmin and a soft maximum τmax for every rewrite rule. A search iteration begins with each ant constructing a derivation tree of the target sort. The construction proceeds by recursively choosing rewrite rules using simple pheromone-proportional selection. The fitness of the constructed trees is calculated, and pheromones are updated by applying evaporation. The iteration-best ant is allowed to deposit pheromones by adding the fitness value to the pheromone level of each rewrite rule used in the derivation. If the iteration-best fitness ever exceeds τmax, then τmax is updated to the higher value. The motivation for this behavior is assigning more weight to pheromone increases caused by finding fit solutions than to pheromone buildup caused by repeatedly exploring an area of the search space. As an additional benefit, this eliminates the need for normalizing the amount of pheromones on the edges (shaking). Upon reaching the stopping condition, the algorithm returns the overall best solution found.
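The pheromone bookkeeping described above can be sketched as follows. The parameter names and values (tau_min, tau_max, rho) are our assumptions for illustration, not ContainAnt's actual settings:

```python
import random

def select(rules, pheromone):
    """Pheromone-proportional (roulette-wheel) choice among rewrite rules."""
    weights = [pheromone[label] for label, _ in rules]
    return random.choices(rules, weights=weights)[0]

def update(pheromone, best_rules, fitness, tau_min=0.1, tau_max=5.0, rho=0.1):
    """One iteration's pheromone update with a soft maximum (names assumed)."""
    for label in pheromone:                 # evaporation, floored at tau_min
        pheromone[label] = max(tau_min, (1 - rho) * pheromone[label])
    for label in best_rules:                # iteration-best deposit
        pheromone[label] += fitness
    if fitness > tau_max:                   # soft maximum: raise the bound
        tau_max = fitness
    for label in pheromone:                 # clamp to the (possibly raised) max
        pheromone[label] = min(pheromone[label], tau_max)
    return pheromone, tau_max

ph = {"0": 1.0, "1": 1.0, "e": 1.0}
ph, tau = update(ph, ["1", "e"], fitness=2.0)
rule = select([("0", ["0", "s"]), ("1", ["1", "s"])], ph)
```

Because the deposit is the raw fitness value and the maximum is only raised (never normalized away), a genuinely fitter solution shifts the rule probabilities more than repeated visits to the same region can.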

4 Implementation

ContainAnt is implemented as a Dependency Injection library for the Scala programming language. The statically typed, object-oriented nature of Scala makes it well-suited for Dependency Injection, and its run-time reflection facilities tremendously simplify the ContainAnt architecture. Moreover, Scala runs on the Java Virtual Machine, allowing the library to work with code bases written in any JVM language (including Clojure and Java).

ContainAnt's job is assembling objects and object graphs. In effect, the library takes over object instantiation. Instead of using the new keyword with a constructor to instantiate classes, the programmer requests an instance of a given class from ContainAnt (ContainAnt create[ClassName]). The container then heuristically determines what to build by resolving dependencies, choosing appropriate constructors and wiring everything together.

Algorithm 4 GrAnt Heuristic

function grant(t: Sort)
  while !stopped
    solv := construct(p, t) { wlog 1 ant }
    iter := fitnessOf(solv)
    evaporatePheromone()
    for rule in solv
      addPheromone(rule, iter)
    end for
    { update max pheromone }
    if iter > tau_max then
      tau_max := iter
    end if
    { update best solution }
    if fitnessOf(best) < iter then
      best := solv
    end if
  end while
  return best
end function

{ recursive path construction }
function construct(p: Pheromones, t: Sort)
  { pheromone-proportional rule selection }
  c := p.select(t.rewriteRules)
  { construct subtree for each non-terminal of the selected rule }
  for i := 0 to c.nonterminals.count
    subtrees[i] := construct(p, sortOf(c.nonterminals[i]))
  end for
  return Tree(c, subtrees)
end function

To take advantage of the heuristic capabilities, the programmer has to supply an objective function. With the exception of this objective function, the configuration of ContainAnt is modeled on Google's popular Guice dependency injection library. The programmer provides a Module (a plain object implementing a marker trait) containing the constructors and helper functions to be used during Dependency Injection. If the software to be optimized uses Dependency Injection, these modules will already be present, ready to be used by ContainAnt. This is in stark contrast with the weaver approach to Programming by Optimization: weaver rules are not present in programs that were not designed with the corresponding PbO toolset in mind.
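In Python terms, the division of labor between a module (the design space) and the objective function can be illustrated like this. An exhaustive search stands in for GrAnt, and the Buffer class, design space and scoring are entirely hypothetical:

```python
import itertools

class Buffer:
    def __init__(self, size, policy):
        self.size, self.policy = size, policy

# Toy "module": candidate values per constructor argument (names assumed).
SPACE = {"size": [64, 128, 256], "policy": ["lru", "fifo"]}

def objective(buf):
    # Hypothetical score: reward larger buffers and prefer the LRU policy.
    return buf.size + (10 if buf.policy == "lru" else 0)

configs = [dict(zip(SPACE, vals)) for vals in itertools.product(*SPACE.values())]
best = max((Buffer(**c) for c in configs), key=objective)
print(best.size, best.policy)  # -> 256 lru
```

The point of the design is that only `objective` is optimization-specific; the design space itself is ordinary constructor wiring that a DI-style code base already contains.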

ContainAnt parses module specifications using Scala's reflection capabilities, turning the Dependency Injection problem into a grammatical optimization instance. Our analysis (Section 5.3) indicates that the default GrAnt search heuristic suffices to solve many optimization and algorithm configuration problems without problem-specific tuning. This means that using ContainAnt does not require the practitioner to deal with grammars, or even be aware of the heuristics working "under the hood".

Since ContainAnt acts like an ordinary dependency injection container, taking over the instantiation of objects and resolution of dependencies, it need not distinguish between off-line and on-line adaptive optimization: the distinction can be made by using an embedded wrapper to select between 'construct on first use' or dynamic/periodic reconstruction [4].

There are no major obstacles to turning the container into a drop-in replacement for Guice by implementing the complete Guice API, thus making PbO immediately available to hundreds of enterprise software projects. This is possibly the most important application of the correspondence detailed in Section 2, and the main future target of ContainAnt development.

5 Case Studies

To demonstrate the general behavior of ContainAnt and SMAC [11], and to compare the performance of their heuristics, we implemented two classical optimization problems (Branin function, Subset Sum) and three algorithm configuration problems (D-ary heaps, skiplists and syntax highlighting). For comparison purposes, one problem of each class was also implemented for use with SMAC. In this section, we offer a detailed look at each problem, followed by a performance comparison showing that GrAnt significantly outperforms the other heuristics in all but one of these problems.

5.1 Classical Problems

5.1.1 Branin Function

In this first case study, we compare ContainAnt with SMAC on a global optimization problem. The goal is to minimize the value of the Branin function on a given bounded subset of the Euclidean plane. The Branin function (introduced by Dixon and Szego in their traditional optimization test suite [5]) has long been a popular benchmark for continuous optimization heuristics. The function has the form

branin(x1, x2) =

((x2 −

5.1

4π2

)x21 +

5

πx1 − 6

)2

+

10

(1− 1

)cos(x1) + 10

with the domain restricted so that x1 ∈ [−5, 10] and x2 ∈ [0, 15]. There are three global minima on this domain, each with value ∼ 0.397 = 2.48⁻¹.
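For reference, the function transcribes directly into Scala (our own sketch, not part of the SMAC or ContainAnt distributions):

```scala
// The Branin benchmark function; global minima ≈ 0.3979 at
// (π, 2.275), (−π, 12.275) and (9.42478, 2.475).
object Branin {
  def apply(x1: Double, x2: Double): Double = {
    val inner = x2 - 5.1 / (4 * math.Pi * math.Pi) * x1 * x1 + 5 / math.Pi * x1 - 6
    inner * inner + 10 * (1 - 1 / (8 * math.Pi)) * math.cos(x1) + 10
  }
}
```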

The Branin function provides an ideal context for comparing the behavior and the performance of SMAC and ContainAnt, since the SMAC distribution already includes a configuration for optimizing the Branin function in one of the default example scenarios.

There are many practical techniques for representing a continuous solution space as a BNF grammar. The most intuitive is to include a sufficiently fine “uniform grid” of constants from the domain as terminals of the grammar. Alternatively, the grammar


of binary strings presented in Section 2 can represent every dyadic fraction in a compact interval. Dyadic fractions form a dense subset of the interval and provide arbitrary-precision approximations to any given number. We decided to go with the former, more intuitive grammar for this experiment. The result is a large grammar with many terminals, but one that aligns very well with SMAC’s solution representation, thereby ensuring that both heuristics explore search spaces of the same size, which leads to a completely fair comparison.
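Under the grid approach, the terminals for each variable might be produced as follows (a hypothetical helper; the name `grid` and the chosen resolution are ours, not taken from the ContainAnt sources):

```scala
// Evenly spaced constants covering [lo, hi], to be used as
// grammar terminals for one continuous variable.
object GridTerminals {
  def grid(lo: Double, hi: Double, steps: Int): Seq[Double] =
    (0 to steps).map(i => lo + (hi - lo) * i / steps)
}

// e.g. terminals for x1 ∈ [−5, 10] at resolution 0.1:
// GridTerminals.grid(-5.0, 10.0, 150)
```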

5.1.2 Subset Sum

Given a finite set of integers S ⊆ Z and a target number c ∈ Z, is there a subset I ⊆ S such that ∑_{i∈I} i = c?

Known as “subset sum”, this is one of the ur-examples of an NP-complete decision problem. Recast as an optimization problem, we attempt to maximize the function f(I) = |c − ∑_{i∈I} i|⁻¹, with f(I) = 2 if ∑_{i∈I} i = c. We use two subset sum benchmark instances (P01 and P03) from Burkardt’s Scientific Computing Dataset [2] for this case study.

The grammar for this instance consists of the finite grammar generated by the numbers in S, along with the following generic rewrite rules for constructing sets of numbers:

empty. <Set> ::= empty

add. <Set> ::= add <Int> <Set>

with the obvious compositional semantics

k(empty) = ∅
k(add y z) = {k(y)} ∪ k(z)

Notice that the argument-passing system of SMAC would not be capable of supplying arguments of this complexity. The experiment is limited to the ContainAnt heuristics, with 100 runs and the heuristics capped at 1000 objective function evaluations.
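The rules and their semantics translate almost mechanically into Scala (our own sketch; the constructor and function names are illustrative):

```scala
// The <Set> grammar as an algebraic data type, with the
// compositional semantics k and the objective f from the text.
sealed trait SetExpr
case object Empty extends SetExpr
case class Add(y: Int, z: SetExpr) extends SetExpr

object SubsetSum {
  def k(e: SetExpr): Set[Int] = e match {
    case Empty     => Set.empty
    case Add(y, z) => Set(y) ++ k(z)
  }
  // f(I) = |c − ΣI|⁻¹, with f(I) = 2 when ΣI = c
  def f(c: Int)(e: SetExpr): Double = {
    val s = k(e).sum
    if (s == c) 2.0 else 1.0 / math.abs(c - s)
  }
}
```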

5.2 Programming by Optimization

5.2.1 D-ary Heaps

A min-heap (resp. max-heap) structure is a rooted tree in which every node has a value larger (smaller) than the value of its parent. A D-ary heap is a heap structure built on a complete D-ary tree. The familiar binary heaps are D-ary heaps with D = 2. General D-ary heaps allow faster key update operations than the binary case: O(log_D n) vs. O(log₂ n). This makes D-ary min-heaps (resp. max-heaps) appropriate for algorithms where decrease (increase) operations are more common than minimum (maximum) extraction.

Generalizing the binary case, the underlying tree can always be implemented as an array, with the children of the ith node placed at indices iD + 1, iD + 2, ..., iD + D. This implementation strategy improves cache efficiency and enables random access. There is a performance trade-off, however: the array will eventually fill up, triggering an expensive resize operation.
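In this layout the index arithmetic is (a sketch with 0-based indices, not the Hoos–Hsu code):

```scala
// Children of node i occupy indices iD+1 .. iD+D,
// so the parent of node j (for j > 0) is (j − 1) / D.
object DHeapIndex {
  def children(i: Int, d: Int): Range = (i * d + 1) to (i * d + d)
  def parent(j: Int, d: Int): Int = (j - 1) / d
}
```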

A D-ary heap data structure implemented with arrays has three parameters: the initial size of the array, the expansion factor of the resize operation, and (of course) the arity D. The optimal values of these parameters depend on the expected number of values to be stored in the structure, as well as the expected distribution of decrease/increase and minimum/maximum extraction operations.

The optimization of D-ary heaps was implemented by Hoos and Hsu as a test instance for the original Programming by Optimization proposal. The original code is written in an extended dialect of the Java programming language, designed for use with a PbO weaver. The weaver-specific declarations had to be factored out into constructor arguments, a mere three lines of changes, one for each parameter described above. The resulting standard Java is directly usable by ContainAnt.

The grammar for the data structure configuration problem consists of the constructor for the dynamic heap class as the only proper rewrite rule; there are classes and constants for heaps, their arities, expansion factors and initial sizes, all of them equipped with their usual semantics. The objective function counts the number of accesses to the underlying array (with each resize operation counting as two accesses per index, in line with the usual amortized analysis for array lists) under a given test load. Evaluating the objective function for this task is very expensive, so the experiment is limited to 10 runs, with the heuristics capped at 1000 objective function evaluations.

Figure 1: A ternary min-heap and its array representation.

5.2.2 Skiplists

Skiplists are a probabilistic alternative to balanced binary search trees [19]. Skiplists are essentially ordered linked lists in which each node may contain multiple forward links. In the familiar linked list, a node consists of a value (a piece of data) and a link to the next node. Nodes in a skiplist contain a whole hierarchy of links, each one pointing to a farther subsequent node than the one below it. These auxiliary links provide an “express lane” for navigating the structure and can be exploited to implement all three dictionary operations (insertion, lookup and deletion of values) with logarithmic expected time complexity. Thus, the performance of skiplists is comparable to that of balanced binary search trees.

Skiplists are parametrized by two numeric values: the transition probability p ∈ R and the maximal height of the hierarchy h ∈ N. To find a given value v in a skiplist, start by following the highest-level links of the hierarchy, advancing until either v is encountered, or the value of the next node is greater than v. In the latter case, continue the search by following links one level down in the hierarchy. To insert a given value v into a skiplist, start by finding its location using the method described above. Create a node for storing v. Now, generate a uniform random real x ∈ [0, 1] and link the newly created node to its neighbors in level ℓ of the hierarchy if and only if x < p^ℓ. When p = 0.5, one can intuitively think of this process as a series of coin flips. If you get heads, you link the node to its neighbors on level ℓ of the hierarchy, then repeat the procedure on level ℓ + 1. If you get tails or reach the maximum height ℓ = h, the insertion operation ends.

Figure 2: Skiplist with a three-layer hierarchy.
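The coin-flip level generation step of the insertion procedure can be sketched as follows (our own sketch; the names are illustrative):

```scala
// A node reaches level ℓ with probability p^ℓ, capped at height h:
// keep "flipping the coin" while it comes up heads.
object SkipLevel {
  def randomLevel(p: Double, h: Int, rng: scala.util.Random): Int = {
    var level = 0
    while (level < h && rng.nextDouble() < p) level += 1
    level
  }
}
```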

Instead of having a fixed parameter p, where the probability of inserting a value into level k of the hierarchy is always 1/p^{−k}, one can consider a more general skiplist architecture, where this probability is given by 1/P_k, where P : N → R⁺ is an arbitrary monotone sequence. In the experiment, we focus on three different types of sequences:

• Geometric: P_k = a^k for some a > 1,

• Arithmetic: P_k = a + k for some a > 0, and

• Sums of the previous two types.

Hence, our skiplists will have two parameters: the maximum height h, and the probability sequence P. The expected time complexity of lookups is independent of the distribution of the values [15]. However, the optimal choices of the parameters P and h do depend on the expected number of items to be stored in the skiplist. Skiplists are often stored in a distributed fashion, where the optimal configuration may further depend on variables such as network latency, giving rise to an on-line data structure configuration problem.


ContainAnt is readily able to solve this parameter tuning problem: indeed, we have already evaluated this capability on a much larger search space in Section 5.2.1. However, we can use optimization to explore a more interesting search space by considering a generalized variant of skiplists.

Let <Prob> denote the non-terminal corresponding to the class of integer sequences. The grammar for this data structure configuration problem has a rewrite rule corresponding to the constructor of the skiplist class, as well as three special rewrite rules for constructing the probability sequences:

geom. <Prob> ::= geom <Double>

arit. <Prob> ::= arit <Double>

sum. <Prob’> ::= sum <Prob> <Prob>

The compositional semantics assigns

k(geom y) = (1, k(y), k(y)², k(y)³, ...)

k(arit y) = (1, 1 + k(y), 1 + 2k(y), ...)

k(sum y z) = k(y) ⊕ k(z)

where the symbol ⊕ denotes the termwise sum of two sequences. As in the other grammars, there are constructors for skiplists and constants for the numerical parameters, all of them equipped with their usual semantics. This shows that the grammatical approach can conveniently represent sophisticated search spaces that would be difficult and sometimes impossible to describe via SMAC’s text-based configuration files. The objective function fills the skiplist structure with 1000 random values, and performs 100 random lookups, measuring the total number of comparisons performed. All heuristics are capped at 100 objective function evaluations. The search is fast enough to make 100 runs of the experiment feasible.
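Following the displayed semantics, the three sequence constructors can be modelled as functions from indices to values (our own sketch; the names mirror the rewrite rules):

```scala
// P_k as a function of k; geom, arit and sum mirror the
// compositional semantics given in the text.
object ProbSeq {
  type PSeq = Int => Double
  def geom(a: Double): PSeq = k => math.pow(a, k)     // (1, a, a², ...)
  def arit(a: Double): PSeq = k => 1 + a * k          // (1, 1+a, 1+2a, ...)
  def sum(y: PSeq, z: PSeq): PSeq = k => y(k) + z(k)  // termwise ⊕
}
```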

5.2.3 Syntax Highlighting

This final case study serves to showcase a practical use case for Programming by Optimization in general and ContainAnt in particular: the creation of software with search-based “dynamic adaptive” features. Our minimal example is a syntax highlighter that automatically adjusts itself to different display environments. The potential applications include

battery-saving color schemes compatible across different devices (using the technique of Burles et al. [3] to incorporate energy consumption into the objective function) and schemes that remain readable when transplanted to different environments (e.g. embedded into social media or displayed by the fixed background color “webview” of a mobile application).

Agda is an increasingly popular dependently typed programming language designed by Ulf Norell [16]. The Agda compiler can generate documentation web pages which include the navigable, syntax-highlighted source code of the compiled software. Unfortunately, the default color scheme for the syntax highlighting is unreadable on dark backgrounds, which causes problems when embedding the generated documentation into a larger website.

Our test program generates a readable color scheme for Agda documentation given a target background color as input. The program consists of little more than a naive fitness function quantifying the readability of a color scheme by penalizing low contrast and by rewarding color schemes based around a small number of complementary colors. All of the search is relegated to either SMAC or ContainAnt. The former requires a configuration file with 27 categorical variables, each with 27 options. In addition, about 100 lines of boilerplate code had to be written for handling command line arguments and interfacing with SMAC. For ContainAnt, the grammar specification, consisting of the constructors for the ColorScheme and RGBValue classes, takes 37 lines altogether. The heuristics are capped at 1000 objective function evaluations.
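To give a flavour of what such a fitness function might look like, here is a deliberately naive contrast penalty (entirely our own illustration; the actual test program and its ColorScheme and RGBValue classes are not reproduced here):

```scala
// Penalise foreground colours whose relative luminance is close to
// the background's: low contrast ⇒ high penalty.
object Readability {
  case class RGB(r: Int, g: Int, b: Int) {
    // standard sRGB luma coefficients
    def luminance: Double = (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0
  }
  def contrastPenalty(bg: RGB, fgs: Seq[RGB]): Double =
    fgs.map(fg => 1.0 - math.abs(fg.luminance - bg.luminance)).sum
}
```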

5.3 Analysis

All experiments were performed on the following sys-tem:

• CPU: Intel Xeon E5-2676 clocked at 2.40 GHz with 30 MB Level 3 cache,

• RAM: 1019280k total,

• Swap: disabled,

• JVM version: 1.8.0_121.


The data and code used to conduct this analysis are published in the companion GitHub repository³ of the article. The ContainAnt implementation is deterministic, and the repository bundles a convenient build script, allowing anyone to execute the same analysis and replicate our results.

The performance of the heuristics was comparedon three variables:

1. The mean quality (avg) achieved by the best-of-run solution returned by the heuristic, averaged over all runs.

2. The optimum quality (max) achieved by the best-of-run solution returned by the heuristic, taken over all runs.

3. The variance⁴ (var) of the quality achieved by the best-of-run solutions, taken over all runs.

The significance of the differences between the performance of the top heuristics is checked using the nonparametric protocol of Wineberg and Christensen [28]. The final p-values are reported in Table 1.

Each experiment is performed with a fixed number of runs (that number depending on the case study, as explained in the respective subsections). Our goal is to pick the technique that achieves the solution of the highest quality possible, given a single run with a fixed budget of “computational effort”. To ensure a fair comparison, we need to limit the number of objective function evaluations identically for all heuristics. For the constructive heuristics (random search and GrAnt), this can easily be achieved by capping the number of iterations. For GrEvo, the number of evaluations depends only on the population size and the number of generations, allowing us to limit the number of evaluations by capping the product of these two parameters. SMAC has a mechanism for imposing this cap directly via the configuration file.

³ https://github.com/zaklogician/ContainAnt

⁴ Important for on-line optimization, where the heuristics will be run a large number of times. A technique with high mean but low variance may well lose out to another technique with lower mean but high variance over a large number of runs.

The GrEvo heuristic has some tunable (hyper)parameters, including population size and the number of generations. We hand-selected the best-performing ratio of these parameters from the set {(100 : 10), (40 : 25), (25 : 40), (10 : 100)} separately for each case study. ContainAnt is capable of tuning the hyper-parameters of its own heuristics; in principle ContainAnt could be used as its own hyper-heuristic to self-improve GrEvo. We experimented with these capabilities during the early days of development. However, we abandoned this avenue once evidence emerged that significant improvement to these parameters would not be possible within the constraints of the case studies (see the paragraph dedicated to GrEvo below).

Table 1 summarizes the results achieved by SMAC and the ContainAnt heuristics on all five case studies presented above.

GrAnt

The GrAnt heuristic significantly outperformed all others in the majority of experiments. The only exception is the syntax highlighting study, where GrEvo systematically had the highest nominal mean. However, hypothesis testing reveals that the differences are not significant. GrAnt is the only heuristic to perform equally well across both combinatorial optimization and algorithm configuration problems, and the only one to find globally optimal solutions to both the Branin function and both subset sum instances.

GrEvo

The poor performance of the GrEvo heuristic, consistent across parameter settings, is crying out for an explanation. Our investigation suggests that the main culprit may be early loss of diversity (visible in the Skiplist study, where the algorithm converges in a mere five generations), caused by the fact that the first few elements of the genome have a disproportionately high influence on the phenotype in Grammatical Evolution [20]. Increasing the population size is not possible without moving beyond the strict computational bounds of our case studies, rendering Grammatical Evolution unsuitable for many real-time applications. Solving this issue could be an avenue of further research.

Table 1: Performance of SMAC, ContainAnt heuristics and random search on the five case studies.

                       GrAnt    GrEvo    Rand.    SMAC
Branin         max:     2.48     1.55     2.45    2.48
               avg:     1.80     0.87     1.37    1.47
               var:     0.24     0.08     0.32    0.35
               p:     <.001
Subset Sum     max:     2.00     1.00     1.00       -
P02            avg:     0.91     0.65     0.03       -
               var:     0.45     0.22     0.02       -
               p:     <.001
Subset Sum     max:     2.00     0.00     0.00       -
P03            avg:     0.38     0.00     0.00       -
               var:     0.62     0.00     0.00       -
               p:     <.001
DHeap          max:    46801    46801    46801       -
               avg:    46801    46752    46594       -
               var:        0    10671    52919       -
               p:     0.168
Skiplist       max:     0.33     0.25     0.27       -
               avg:     0.28     0.25     0.25       -
               var:     0.01     0.00     0.00       -
               p:     <.001
Syntax H.      max:    37.82    38.02    37.57       -
Blue           avg:    34.36    34.50    32.18       -
               var:     5.85     8.73     7.46       -
               p:     0.663
Syntax H.      max:    34.92    34.68    33.85   34.44
Yellow         avg:    31.51    32.33    29.22   30.92
               var:     4.02     3.30     5.67    5.22
               p:     0.082

SMAC

As expected, the quality of the results returned by SMAC significantly outperformed random search in all cases. However, the average quality lingered beneath that of GrAnt in the case of the Branin function (although the best solution for the Branin function was globally optimal) and beneath both ContainAnt heuristics in the algorithm configuration case. Another major issue is speed: SMAC spends over four hours on the latter problem, while the ContainAnt heuristics both finish in 46 seconds.

6 Conclusion

Dependency Injection can be used to improve the existing weaver-based Programming by Optimization tools. We have described a library that implements several grammatical optimization metaheuristics, including a novel Ant Programming approach. The library provides better support for Programming by Optimization than specialized language extensions and weaver tools, while doing away with several limitations, such as difficulties with on-line optimization.

Furthermore, regarding Dependency Injection as an instance of a grammatical optimization problem leads to a whole new class of heuristics for automatic algorithm configuration. The proposed grammatical Ant Programming heuristic GrAnt significantly outperforms existing algorithms on five problems of interest, in one case reducing a four-hour-long SMAC optimization task to 46 seconds while significantly improving on the solution quality.

Programming by Optimization libraries can act as drop-in replacements for existing Dependency Injection containers, making PbO immediately applicable to a large number of enterprise software projects. The development of ContainAnt in this direction is a promising target of future work.

References

[1] Carlos Ansotegui, Meinolf Sellmann, and Kevin Tierney. A Gender-based Genetic Algorithm for the Automatic Configuration of Algorithms. In Proceedings of the 15th International Conference on Principles and Practice of Constraint Programming, CP'09, pages 142–157, Berlin, Heidelberg, 2009. Springer-Verlag.

[2] J. Burkardt. Data for the Subset Sum Problem. Available at https://people.sc.fsu.edu/~jburkardt/datasets/subset_sum/subset_sum.html, 2013. Accessed April 12, 2017.

[3] N. Burles, E. Bowles, B. R. Bruce, and K. Srivisut. Specialising Guava's Cache to Reduce Energy Consumption. In Search-Based Software Engineering - 7th International Symposium, SSBSE 2015, Bergamo, Italy, September 5-7, 2015, Proceedings, pages 276–281, 2015.

[4] Nathan Burles, Jerry Swan, Edward Bowles, Alexander E. I. Brownlee, Zoltan A. Kocsis, and Nadarajen Veerapen. Embedded Dynamic Improvement. In Proceedings of the Companion Publication of the 2015 Annual Conference on Genetic and Evolutionary Computation, GECCO Companion '15, pages 831–832, New York, NY, USA, 2015. ACM.

[5] L. C. W. Dixon and G. P. Szego. The global optimization problem: an introduction. In L. C. W. Dixon and G. P. Szego, editors, Towards Global Optimisation, volume 2. North Holland, Amsterdam, The Netherlands, 1978.

[6] M. Forsberg and A. Ranta. The Labelled BNF Grammar Formalism. Technical report, Chalmers University of Technology, Gothenburg, Sweden, February 2005.

[7] M. Fowler. Inversion of Control Containers and the Dependency Injection pattern. http://martinfowler.com/articles/injection.html, retrieved 10 April 2015.

[8] Mark Harman, Yue Jia, William B. Langdon, Justyna Petke, Iman Hemati Moghadam, Shin Yoo, and Fan Wu. Genetic Improvement for Adaptive Software Engineering (Keynote). In Proceedings of the 9th International Symposium on Software Engineering for Adaptive and Self-Managing Systems, SEAMS 2014, pages 1–4, New York, NY, USA, 2014. ACM.

[9] Mark Harman, S. Afshin Mansouri, and Yuanyuan Zhang. Search-based Software Engineering: Trends, Techniques and Applications. ACM Comput. Surv., 45(1):11:1–11:61, December 2012.

[10] Holger H. Hoos. Programming by optimization. Commun. ACM, 55(2):70–80, 2012.

[11] F. Hutter, H. H. Hoos, and K. Leyton-Brown. Sequential Model-Based Optimization for General Algorithm Configuration. In Proc. of LION-5, pages 507–523, 2011.

[12] Frank Hutter, Holger H. Hoos, Kevin Leyton-Brown, and Thomas Stutzle. ParamILS: An Automatic Algorithm Configuration Framework. J. Artif. Int. Res., 36(1):267–306, September 2009.

[13] C. Keber and M. G. Schuster. Option Valuation with Generalized Ant Programming. In Proceedings of the 4th Annual Conference on Genetic and Evolutionary Computation, GECCO'02, San Francisco, CA, USA, 2002. Morgan Kaufmann.

[14] Manuel Lopez-Ibanez, Jeremie Dubois-Lacoste, Leslie Perez Caceres, Thomas Stutzle, and Mauro Birattari. The irace package: Iterated racing for automatic algorithm configuration. Operations Research Perspectives, 3:43–58, 2016.

[15] R. Motwani and P. Raghavan. Randomized Algorithms. Cambridge International Series on Parallel Computation. Cambridge University Press, Cambridge, UK, 1995.

[16] U. Norell. Dependently Typed Programming in Agda. In Proceedings of the 4th International Workshop on Types in Language Design and Implementation, TLDI '09, New York, NY, USA, 2009. ACM.

[17] M. O'Neill and A. Brabazon. Grammatical Swarm: The generation of programs by social programming. Natural Computing, 5(4):443–462, 2006.

[18] D. R. Prasanna. Dependency Injection. Manning Publications, 1st edition, 2009.

[19] W. Pugh. Skip Lists: A Probabilistic Alternative to Balanced Trees. Communications of the ACM, 33(6):668–676, 1990.

[20] F. Rothlauf and M. Oetzel. On the Locality of Grammatical Evolution. In P. Collet, M. Tomassini, M. Ebner, S. Gustafson, and A. Ekart, editors, Proceedings, Genetic Programming: 9th European Conference (EuroGP 2006), Berlin, Heidelberg, 2006. Springer-Verlag.

[21] C. Ryan, J. J. Collins, and M. O'Neill. Grammatical Evolution: Evolving Programs for an Arbitrary Language. In W. Banzhaf, R. Poli, M. Schoenauer, and T. C. Fogarty, editors, Proceedings of the First European Workshop on Genetic Programming, volume 1391 of LNCS, Berlin, Germany, 1998. Springer-Verlag.

[22] A. Salehi-Abari and T. White. Enhanced Generalized Ant Programming (EGAP). In Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation, GECCO '08, New York, NY, USA, 2008. ACM.

[23] A. Salehi-Abari and T. White. The uphill battle of Ant Programming vs. Genetic Programming. In Proceedings of the International Joint Conference on Computational Intelligence (IJCCI), 2009.

[24] T. Stutzle and H. H. Hoos. MAX-MIN Ant System. Future Generation Computer Systems, 2000.

[25] Jerry Swan and Nathan Burles. Genetic Programming: 18th European Conference, EuroGP 2015, Copenhagen, Denmark, April 8-10, 2015, Proceedings, chapter Templar - A Framework for Template-Method Hyper-Heuristics, pages 205–216. Springer International Publishing, Cham, 2015.

[26] Jerry Swan, Michael G. Epitropakis, and John R. Woodward. Gen-O-Fix: An embeddable framework for Dynamic Adaptive Genetic Improvement Programming. (CSM-195):1–12, January 2014.

[27] Jerry Swan, Krzysztof Krawiec, and Neil Ghani. Polytypic Genetic Programming. In Giovanni Squillero, editor, 20th European Conference on the Applications of Evolutionary Computation, volume 10200 of LNCS, pages 66–81, Amsterdam, 19-21 April 2017. Springer.

[28] M. Wineberg and S. Christensen. Statistical Analysis for Evolutionary Computation: Introduction. In Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers, GECCO '09, pages 2949–2976, New York, NY, USA, 2009. ACM.

[29] Kwaku Yeboah-Antwi and Benoit Baudry. Embedding Adaptivity in Software Systems Using the ECSELR Framework. In Proceedings of the Companion Publication of the 2015 Annual Conference on Genetic and Evolutionary Computation, GECCO Companion '15, pages 839–844, New York, NY, USA, 2015. ACM.