Page 1

Adaptive Optimization in the Jalapeño JVM

Matthew Arnold

Stephen Fink

David Grove

Michael Hind

Peter F. Sweeney

Source: CS598 @ UIUC

Page 2

Talk overview

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 3

Background

Three waves of JVMs:
– First: compile each method when first encountered; use a fixed set of optimizations
– Second: determine hot methods dynamically and compile them with more advanced optimizations
– Third: feedback-directed optimizations

Jalapeño JVM targets the third wave, but the current implementation is second wave

Page 4

Jalapeño JVM

Written in Java (core services precompiled to native code in boot image)

Compiles at four levels: baseline, 0, 1, & 2
Compile-only strategy (no interpretation)
Yield points for quasi-preemptive thread switching

Page 5

Talk progress

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 6

Adaptive Optimization System

Page 7

AOS: Design

The authors say the “distributed, asynchronous, object-oriented design” is useful for managing large amounts of profiling data

Each successive stage of the pipeline (from raw data to compilation decisions) performs increasingly complex analysis on decreasing amounts of data

Page 8

Talk progress

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 9

Multi-level recompilation

Page 10

Multi-level recompilation: Sampling

Sampling occurs on thread switch
Thread switch triggered by clock interrupt
Thread switch can occur only at yield points
Yield points are method invocations and loop back edges

Discussion: Is this approach biased?
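The sampling scheme above can be sketched in Java (a minimal illustration of the stated mechanism, not Jalapeño's actual code; the class and method names are invented):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of sample-based profiling: on each clock-driven thread switch
// (which can only happen at a yield point), record which method was
// executing. Hot methods accumulate high sample counts over time.
class MethodSampler {
    private final Map<String, Integer> samples = new HashMap<>();

    // Called from the thread-switch handler with the method that was
    // active at the yield point where the switch occurred.
    void takeSample(String methodId) {
        samples.merge(methodId, 1, Integer::sum);
    }

    int sampleCount(String methodId) {
        return samples.getOrDefault(methodId, 0);
    }
}
```

Because samples can only land on yield points, the counts measure where the program is when thread switches happen, which is exactly the source of the bias question above.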

Page 11

Multi-level recompilation: Biased sampling

[Figure: sampling-bias examples — code with no method calls or back edges; a short method vs. a long method, with method call sites marked]

Page 12

Multi-level recompilation: Cost-benefit analysis

Method m compiled at level i; estimate:
– Ti, expected time program will spend executing m if m is not recompiled
– Cj, the cost of recompiling m at optimization level j, for i ≤ j ≤ N
– Tj, expected time program will spend executing m if m is recompiled at level j
– If, for the best j, Cj + Tj < Ti, recompile m at level j

Page 13

Multi-level recompilation: Cost-benefit analysis (continued)

Estimate Ti:

Ti = Tf * Pm

Tf is the future running time of the program

We estimate that the program will run for as long as it has run so far

Page 14

Multi-level recompilation: Cost-benefit analysis (continued)

Pm is the percentage of Tf spent in m

Pm estimated from sampling
Sample frequencies decay over time
– Why is this a good idea?
– Could it be a disadvantage in certain cases?
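The decaying sample frequencies can be illustrated with a simple exponential-decay counter (a hypothetical sketch; the decay factor and the epoch mechanism are assumptions, not details taken from the paper):

```java
// Sketch of a decaying sample count: periodically multiplying every
// count by a factor < 1 weights recent samples more heavily, so the
// estimate of Pm tracks the program's current behavior rather than
// its entire history.
class DecayingCounter {
    private double count = 0.0;
    private final double decayFactor;  // e.g. 0.5 per decay epoch (assumed value)

    DecayingCounter(double decayFactor) {
        this.decayFactor = decayFactor;
    }

    void increment() { count += 1.0; }

    // Invoked once per decay epoch, e.g. by a periodic organizer thread.
    void decay() { count *= decayFactor; }

    double value() { return count; }
}
```

A method that was hot early but is no longer sampled sees its count shrink toward zero; the flip side (the potential disadvantage the slide asks about) is that a long-running but only intermittently sampled hot method can be forgotten.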

Page 15

Multi-level recompilation: Cost-benefit analysis (continued)

Statically-measured speedups Si and Sj used to determine Tj:

Tj = Ti * Si / Sj

– Statically-measured speedups?!
– Is there any way to do better?

Page 16

Multi-level recompilation: Cost-benefit analysis (continued)

Cj (cost of recompilation) estimated using a linear model of speed for each optimization level:

Cj = aj * size(m), where aj = constant for level j

– Is it reasonable to assume a linear model?
– OK to use statically-determined aj?
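Combining the three estimates, the recompilation decision can be sketched as below (a simplified model of the formulas on the preceding slides; the per-level speedup and compile-rate constants are illustrative placeholders, not the paper's measured values):

```java
// Sketch of the cost-benefit recompilation decision:
//   Ti = Tf * Pm          (expected future time in m at current level i)
//   Tj = Ti * Si / Sj     (expected future time if recompiled at level j)
//   Cj = aj * size(m)     (estimated compile cost at level j)
// Recompile at the best level j > i whenever Cj + Tj < Ti.
class RecompilationModel {
    // Illustrative per-level speedups Sk and compile-time rates ak
    // (index 0 = baseline); these are assumed numbers, not Jalapeño's.
    static final double[] SPEEDUP = {1.0, 2.0, 3.0, 3.5};       // Sk
    static final double[] COMPILE_RATE = {0.0, 0.1, 0.5, 1.0};  // ak, time per unit size

    // Returns the best level to recompile at, or currentLevel if
    // no recompilation pays off.
    static int chooseLevel(int currentLevel, double futureTimeTf,
                           double fractionPm, double methodSize) {
        double ti = futureTimeTf * fractionPm;
        int best = currentLevel;
        double bestTotal = ti;  // cost of doing nothing
        for (int j = currentLevel + 1; j < SPEEDUP.length; j++) {
            double tj = ti * SPEEDUP[currentLevel] / SPEEDUP[j];
            double cj = COMPILE_RATE[j] * methodSize;
            if (cj + tj < bestTotal) {
                bestTotal = cj + tj;
                best = j;
            }
        }
        return best;
    }
}
```

With these constants, a method absorbing a large fraction of future running time justifies the highest level, while a rarely-sampled method never recoups any compile cost and stays at its current level.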

Page 17

Multi-level recompilation: Results

Page 18

Multi-level recompilation: Results (continued)

Page 19

Multi-level recompilation: Discussion

Adaptive multi-level recompilation does better in the short term than a JIT compiling at any single fixed level.

But in the long run, performance is slightly worse than JIT compilation.

The primary target is server applications, which tend to run for a long time.

Page 20

Multi-level recompilation: Discussion (continued)

So what’s so great about Jalapeño’s AOS?

– Current AOS implementation gives good results in both the short and the long term
– A JIT compiler can't handle both cases well because its optimization level is fixed
– The AOS can be extended to support feedback-directed optimizations such as
  fragment creation (as in Dynamo)
  determining whether an optimization was effective

Page 21

Talk progress

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 22

Miscellaneous issues: Multiprocessing

Authors say that if a processor is idle, recompilation can be done almost for free.

– Why almost for free?
– Are there situations when you could get free recompilation on a uniprocessor?

Page 23

Miscellaneous issues: Models vs. heuristics

Authors moving toward “analytic model of program behavior” and elimination of ad-hoc tuning parameters.

Tuning parameters proved difficult because of “unforeseen differences in application behavior.”

Is it believable that ad-hoc parameters can be eliminated and replaced with models?

Page 24

Miscellaneous issues: More intrusive optimizations

The future of Jalapeño is more intrusive optimizations, such as compiler-inserted instrumentation for profiling

Advantages and disadvantages compared with current system?

– Advantages:
  Performance gains in the long term
  Adjusts to phased behavior
– Disadvantages:
  Unlike with sampling, you can't profile all the time
  Harder to adaptively throttle overhead

Page 25

Miscellaneous issues: Stack frame rewriting

In the future, Jalapeño will support rewriting of a baseline stack frame with an optimized stack frame

Authors say that rewriting an optimized stack frame with another optimized stack frame is more difficult.
– Why?

Page 26

Talk progress

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 27

Feedback-directed inlining

Page 28

Feedback-directed inlining: More cost-benefit analysis

Boost factor estimated:
– Boost factor b is a function of

1. The fraction f of dynamic calls attributed to the call edge in the sampling-approximated call graph

2. Estimate s of the benefit (i.e., speedup) from eliminating virtually all calls from the program

– Presumably something like b = f * s.
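Under the slide's guess that b = f * s, the boost computation might look like this (hypothetical names and parameters; the paper does not spell out the exact formula):

```java
// Sketch of the feedback-directed inlining boost factor:
//   f = fraction of sampled dynamic calls attributed to this call edge
//       in the sampling-approximated call graph
//   s = estimated program speedup from eliminating virtually all calls
//   b = f * s   (the presumed form; raises the edge's inlining priority)
class InliningBoost {
    static double boostFactor(double edgeSamples, double totalCallSamples,
                              double wholeProgramSpeedup) {
        double f = edgeSamples / totalCallSamples;  // fraction on this edge
        return f * wholeProgramSpeedup;
    }
}
```

On this reading, an edge carrying a quarter of all sampled calls would receive a quarter of the whole-program call-elimination benefit as its boost.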

Page 29

Feedback-directed inlining: Results

[Results charts; two anomalous data points annotated “Why?”]

Page 30

Talk progress

Introduction: Background & Jalapeño JVM
Adaptive Optimization System (AOS)
Multi-level recompilation
Miscellaneous issues
Feedback-directed inlining
Conclusion

Page 31

Conclusion

AOS designed to support feedback-directed optimizations (third wave)

Current AOS implementation only supports selective optimizations (second wave)

– Improves short-term performance without hurting the long term
– Uses a mix of cost-benefit model and ad-hoc methods

Future work will use more intrusive performance monitoring (e.g., instrumentation for path profiling, checking that an optimization improved performance)