
Research: Implementing ART Just-In-Time (JIT) Compiler

Android 7.0 adds a just-in-time (JIT) compiler with code profiling to Android runtime (ART) that constantly improves the performance of Android apps as they run. The JIT compiler complements ART’s current ahead-of-time (AOT) compiler and improves runtime performance, saves storage space, and speeds app updates and system updates.

The JIT compiler also improves upon the AOT compiler by avoiding system slowdown during automatic application updates or recompilation of applications during OTAs. This feature should require minimal device integration on the part of manufacturers.

JIT and AOT use the same compiler with an almost identical set of optimizations. The generated code is not necessarily the same, however: JIT makes use of runtime type information and can do better inlining. Also, the JIT sometimes performs OSR (on-stack replacement) compilation, which again generates slightly different code.
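To make this concrete, the following minimal Java sketch (class and method names are invented purely for illustration) shows the kind of code where runtime information helps: every call in the hot loop happens to dispatch to the same concrete type, so the JIT can use the type information it observes to inline the call, and because the loop is hot while it is still running, it is also a candidate for OSR compilation.

// Illustrative Java only; the class names are invented and nothing here is an ART API.
interface Shape {
    double area();
}

final class Circle implements Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    @Override public double area() { return Math.PI * r * r; }
}

public class JitFriendlyLoop {
    public static void main(String[] args) {
        Shape[] shapes = new Shape[1_000_000];
        for (int i = 0; i < shapes.length; i++) {
            shapes[i] = new Circle(i % 10);   // at runtime every element is a Circle
        }

        // A long-running loop like this starts out interpreted. The JIT can observe
        // that s.area() always dispatches to Circle.area() and use that runtime type
        // information to inline the call, whereas an AOT compiler only sees the static
        // type Shape at this call site. Because the loop is hot while it is still
        // running, it is also a candidate for on-stack replacement (OSR) compilation.
        double total = 0;
        for (Shape s : shapes) {
            total += s.area();
        }
        System.out.println(total);
    }
}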

Architectural Overview

Figure 1. JIT architecture – how it works

Flow

JIT compilation works in this manner:

1. The user runs the app, which then triggers ART to load the .dex file.
2. If the .oat file (the AOT binary for the .dex file) is available, ART uses it directly. Note that .oat files are generated regularly; however, that does not imply they contain compiled code (an AOT binary).
3. If no .oat file is available, ART runs the .dex file through either the JIT or an interpreter. ART always uses the .oat files if available. Otherwise, it uses the APK and extracts it in memory to get to the .dex, incurring a large memory overhead (equal to the size of the .dex files). A sketch of this decision logic appears after this list.
4. JIT is enabled for any application that is not compiled according to the “speed” compilation filter (which says: compile as much as you can from the app).
5. The JIT profile data is dumped to a file in a system directory. Only the application has access to the directory.
6. The AOT compilation (dex2oat) daemon parses that file to drive its compilation.
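
The following Java sketch restates steps 2 to 4 of the flow above as code. The enum, class, and fields are purely illustrative and are not part of ART; they simply encode the decisions under the assumption that an .oat file may exist with or without compiled code.

// Illustrative only: these types and fields are hypothetical and do not exist in
// ART; they just restate steps 2 to 4 of the flow above as code.
enum ExecutionPath { AOT_COMPILED_CODE, JIT_OR_INTERPRETER_FROM_OAT, JIT_OR_INTERPRETER_FROM_APK }

final class DexLoadDecision {
    boolean oatFileAvailable;         // step 2: an .oat file exists for this .dex
    boolean oatContainsCompiledCode;  // .oat files are generated regularly but may hold no AOT binary
    boolean compiledWithSpeedFilter;  // "speed" filter: compile as much as you can from the app

    // Which path executes the app's code.
    ExecutionPath choosePath() {
        if (oatFileAvailable && oatContainsCompiledCode) {
            return ExecutionPath.AOT_COMPILED_CODE;           // use the AOT binary directly
        }
        if (oatFileAvailable) {
            // The .oat file is still used, but the .dex is run via JIT or interpreter.
            return ExecutionPath.JIT_OR_INTERPRETER_FROM_OAT;
        }
        // No .oat file at all: extract the .dex from the APK in memory,
        // incurring a memory overhead equal to the size of the dex files.
        return ExecutionPath.JIT_OR_INTERPRETER_FROM_APK;
    }

    // Step 4: JIT is enabled for any app not compiled with the "speed" filter.
    boolean jitEnabled() {
        return !compiledWithSpeedFilter;
    }
}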

Figure 2. Profile-guided compilation

As a result, there is no guarantee that the snapshot taken when the application is in the background will contain the complete data (i.e. everything that was JITed).
There is no attempt to record everything, as that would impact runtime performance.
Methods can be in three different states:
interpreted (dex code)
JIT compiled
AOT compiled
If both JIT and AOT code exist (e.g. due to repeated de-optimizations), the JITed code is preferred.
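
As a rough illustration of that preference, here is a small Java sketch; MethodCode and selectEntryPoint are hypothetical names, not ART APIs.

// Hypothetical sketch of the three method states; MethodCode and selectEntryPoint
// are illustrative names, not ART APIs.
enum MethodCode { INTERPRETED_DEX, JIT_COMPILED, AOT_COMPILED }

final class MethodStateExample {
    static MethodCode selectEntryPoint(boolean hasJitCode, boolean hasAotCode) {
        if (hasJitCode) {
            // If both JIT and AOT code exist (e.g. after repeated de-optimizations),
            // the JITed code is preferred.
            return MethodCode.JIT_COMPILED;
        }
        if (hasAotCode) {
            return MethodCode.AOT_COMPILED;
        }
        return MethodCode.INTERPRETED_DEX; // otherwise the method runs interpreted from dex code
    }
}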
The memory required to run the JIT without impacting foreground app performance depends on the app in question. Large apps require more memory than small apps; in general, big apps stabilize around 4 MB.