Introducing the new Android Runtime - ART

EarlyMon

Ever since the news just prior to the Nexus 5 release that Google had been working on acquiring a company (FlexyCore) specializing in precompiling and optimizing apps, there have been a lot of myths and fantasies floating around about Dalvik changes in Android.

Turns out, there's a bit of an Easter Egg hiding in Developer Options - ART -

http://gigaom.com/2013/11/06/google...-potential-replacement-for-dalvik-in-android/

It's experimental and incomplete but it's there to play with if you like.

http://source.android.com/devices/tech/dalvik/art.html
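
If you want to be sure which runtime you're actually on after flipping the switch, Google's ART verification notes say java.vm.version reports 2.0.0 or higher under ART. A minimal check along those lines (plain Java; the class and log tag names are just for illustration):

import android.util.Log;

// Minimal sketch: per Google's ART verification notes, java.vm.version is
// "2.0.0" or higher under ART and "1.x.x" under Dalvik.
public final class RuntimeCheck {
    private static final String TAG = "RuntimeCheck"; // illustrative tag

    public static boolean isArt() {
        String vmVersion = System.getProperty("java.vm.version", "");
        return vmVersion.startsWith("2");
    }

    public static void logRuntime() {
        Log.i(TAG, isArt() ? "Running on ART" : "Running on Dalvik");
    }
}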

I recommend a good backup before tinkering, check out Helium Backup if you're not rooted.

Warning - this IS a developer option.

Enjoy. :)
 
To take full advantage of this, do you need to factory reset and reinstall apps? I am under the impression that ART works in part by doing something when the app installs, but this is a bit over my head. Probably won't be trying until I am sure how everything works normally on the N5, but I'm curious. Then again, this phone is so damn fast I'm not sure I would notice the difference. ;)
 
  • Like
Reactions: sandipagt1975
Upvote 0
When the Dalvik cache is cleared, it's rebuilt at boot-up.

According to the Android Police article, ART works the same way; they warned that boot-up could take some ten minutes to compile your existing apps.

Thereafter, apps are compiled when installed, just as the Dalvik cache is updated when apps are installed.

Based on that, a factory data reset ought not be required.
 
Upvote 0
I've been running it for 2 days; no issues so far. Correct, a factory reset is not required. It just needs to reboot. It goes through its "Android is updating" routine. From reboot to up and running, it took 8 minutes.
 
Upvote 0
Been running it for about 24 hours now.

Haven't had any issues yet, but I've read of Greenify and Titanium Backup incompatibilities.

Same goes for custom kernels. If you're a tinkerer, it's probably best to leave it be until devs get more time with it.

It's been reported that this has been worked on over the last 2 years, so is it safe to assume it isn't something FlexyCore was in on (at least until recently)?
 
Upvote 0
Google engineers have been in talks with FlexyCore people for the last year.

Google said their engineers did this - thanks to the acquisition, FlexyCore's engineers are Google engineers.

FlexyCore stated in 2010 that they were compiling apps into native code (and demonstrated it, I can post a YouTube on that), and that their target was the OEM level, not end users, for revenue.

I could be wrong, but I don't see how this wouldn't be at least a piece of FlexyCore's DroidBooster.
 
Upvote 0
Certainly sounds like it fits the bill
 
  • Like
Reactions: EarlyMon
Upvote 0
Indeed there is a limit to any optimization. And some stuff just has to be done at runtime.

I wonder if some of the aggressive compression AAPT does could be one place they might be able to gain performance, i.e. whether another of the optimizations might be uncompressing resources like PNGs into the bitmap format the device's screen supports (ARGB_8888, ALPHA_8, RGB_565, etc.).
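
Something along these lines is what I have in mind (inPreferredConfig is a real BitmapFactory option; decoding straight to RGB_565 halves the bytes per pixel for opaque images, so pre-baking that choice could be one such win):

import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Sketch only: decode a PNG resource straight into RGB_565 instead of the
// default ARGB_8888 - 2 bytes per pixel instead of 4 for opaque images.
// resId would be any drawable resource the app ships.
public final class BitmapDecodeSketch {
    public static Bitmap decodeTo565(Resources res, int resId) {
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inPreferredConfig = Bitmap.Config.RGB_565; // drop the alpha channel
        return BitmapFactory.decodeResource(res, resId, opts);
    }
}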
 
  • Like
Reactions: EarlyMon
Upvote 0
Indeed.

It makes sense that this would save time loading apps.

Prior to this, Google's position has been that, aside from anything computationally intensive, native code holds no advantage over Java.

Source for that, the NDK, emphasis from Google -

http://developer.android.com/tools/sdk/ndk/index.html

Before downloading the NDK, you should understand that the NDK will not benefit most apps. As a developer, you need to balance its benefits against its drawbacks. Notably, using native code on Android generally does not result in a noticeable performance improvement, but it always increases your app complexity. In general, you should only use the NDK if it is essential to your app
 
Upvote 0
It only takes a reboot to go back and forth, so if you've got anything specific you'd like compared I'll see what I can do :)

Please try your favorite monitor - something like Quick System Info Pro will do - and let's see if anything stands out at a zero-level check.

In QSIP, Processes tab, note some memory sizes of favorite apps run both ways (make sure Show Memory Usage is checked under Preferences from that tab/page).

QSIP also allows you to place a memory-used notification in your task bar and start that at boot-up - see Preferences from the main page, Task Killer Notification.

Check memory free after a reboot and some reasonable settling time - you decide, 10 minutes maybe - for both configurations.

Final point is anecdotal, but check that memory pull down while trying to run some things in the same way (not terribly important for a first look - refresh gmail, visit our home page and search for something in the Play Store for example).

That will tell us two things on a very rough sampling - individual app memory used in a few cases and free memory while running.
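
If anyone would rather do it in code than eyeball QSIP, a rough sketch using the standard ActivityManager/Debug APIs would report the same two numbers (the class and log tag names are just examples):

import android.app.ActivityManager;
import android.content.Context;
import android.os.Debug;
import android.util.Log;

// Rough sketch: log this app's PSS plus system-wide free memory,
// so runs under Dalvik and ART can be compared side by side.
public final class MemCheck {
    public static void log(Context context) {
        Debug.MemoryInfo self = new Debug.MemoryInfo();
        Debug.getMemoryInfo(self); // fills in PSS for the calling process

        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        ActivityManager.MemoryInfo system = new ActivityManager.MemoryInfo();
        am.getMemoryInfo(system);

        Log.i("MemCheck", "app PSS ~ " + self.getTotalPss() + " kB, "
                + "system free ~ " + (system.availMem / 1024) + " kB");
    }
}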
 
  • Like
Reactions: alostpacket
Upvote 0
I'll have some free time to tinker tomorrow AM; I'll do that and get back to you then. :thumbup:
 
Upvote 0
Has anyone done this yet? I'm reading how it helps speed things up over using Dalvik. Still don't really know what it does. Just did it on my Gnex running Shiny's 4.4 ROM.
To do it, head into Developer Options and swap over Dalvik to ART. Some are reporting force closes (FCs) though, so tread softly.

Edit: Heads up, if you have a lot of apps and you enable it, the reboot takes a LONG time. It's not bootlooping, just swapping all the cache over.

2nd Edit: Some reading here:
http://www.androidpolice.com/2013/1...-in-secret-for-over-2-years-debuts-in-kitkat/
(FYI, I have 172 apps and it's coming up on 10 min to optimize them and it's still not done)
 
Upvote 0
FlexyCore said that they performed optimizations beyond just compiling and Google says that this is a work in progress.

Certainly interesting to see what other optimizations are coming.

I'd also love to see some real memory profiling of the difference here so far.

There seems to be an assumption that the new compiled code will be more compact, reducing the memory footprint.

There are a number of ways to compile code, and I don't think the assumption is automatically true.

As you point out, it's easy to conceive of an optimization that trades space for speed.

Other classic examples are loop unrolling and vectorization.
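
To make the loop unrolling one concrete (pure illustration, nothing ART-specific), the hand-unrolled version below emits more code but executes fewer loop-bound checks:

// Illustrative only: summing an array with the loop unrolled by four.
// More instructions emitted, fewer loop-bound checks executed.
public final class Unroll {
    public static long sum(int[] xs) {
        long total = 0;
        int i = 0;
        int limit = xs.length - (xs.length % 4);
        for (; i < limit; i += 4) {
            total += xs[i] + xs[i + 1] + xs[i + 2] + xs[i + 3];
        }
        for (; i < xs.length; i++) { // leftover elements
            total += xs[i];
        }
        return total;
    }
}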

We know it's not a simple case of precompiling to get the same output as the JIT compiler, as evidenced by the earlier report that some games didn't work (yet) under this scheme.

We also need to look at the future impact here.

Pretty clear that 64-bit Android is coming.

Java in a virtual machine is immune to 32/64-bit differences; it's up to the virtual machine to sort that out.

Whether this approach has the same immunity remains to be seen.

Likewise for backup restoration to a new phone architecture.

Interesting point about the NDK! Also I could have sworn I read the JNI overhead was not insubstantial. That was the reason I thought to avoid JNI (that, and much of what is thought to be computationally expensive is already translated to a native implementation by the framework/VM itself).

And definitely the loops are translated by the VM to a native implementation -- it would be interesting to see how this gets optimized. This could certainly gain some performance for large loops (100+ iterations). Could be interesting too, if lambdas come to Android's flavor of Java.

For non-game apps the two biggest performance drags I have noticed are layout and image rendering.

Image optimizations could better prepare the data for something like Skia to send it to the SurfaceFlinger, but I'm not sure how they would optimize layout. A lot of that happens because Android's dynamic nature requires several measurement passes through the view tree. I think this is something they improved behind the scenes in 4.3.

I haven't done memory profiling on any apps I've developed in a while, but I seem to remember that image decoding is performed on the system heap yet still counts against the app's memory usage limit. I'm curious what the optimization would be here.

Another choke point seems to be reflection/method invocation and object access. Then of course there is file IO and Network IO. Network IO has been aided by libraries like Volley, and hopefully more stuff like that will come along. But I suspect platform specific optimizations could certainly help here, though I'm not sure exactly how.
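
As a toy illustration of the reflection cost I mean, the same method called directly and via java.lang.reflect looks like this; the reflective path adds method lookup, access checks and argument boxing on every call (class and method names here are made up):

import java.lang.reflect.Method;

// Toy illustration: the same call made directly and via reflection.
// The reflective path adds method lookup, access checks and argument boxing.
public final class ReflectDemo {
    public static int twice(int x) {
        return x * 2;
    }

    public static void main(String[] args) throws Exception {
        int direct = twice(21); // direct call

        Method m = ReflectDemo.class.getMethod("twice", int.class);
        int reflected = (Integer) m.invoke(null, 21); // reflective call

        System.out.println(direct + " " + reflected);
    }
}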


This is certainly an interesting thread though -- even if much of it happens beyond my little sphere of app-dev-specific knowledge. :)
 
  • Like
Reactions: EarlyMon
Upvote 0
Ah. If it helps, you may recall discussion in another thread where I said that context switching needed to be accounted for when comparing native to Java.

You probably know this, but just to be complete: a JNI call is a specific instance of that class of operation, context switching.

That's definitely a boundary layer for optimization, you're absolutely correct on that. (What else is new lol.)

I don't know of any generic way to predict the runtime cost, because the user impact for common apps has to include how often it's jumping back and forth versus time spent on the other side; even assuming best design practices and best methods, I think that still has to vary by app type.
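
For anyone following along, the bare shape of that boundary looks like this; every call to the native method crosses from managed code to native and back, and that round trip is the per-call cost we're talking about (the library and method names are made up):

// Minimal JNI sketch: "nativedemo" and sumNative are made-up names.
// Each call to sumNative() marshals arguments across the Java/native
// boundary and back; that round trip is the per-call overhead.
public final class NativeSum {
    static {
        System.loadLibrary("nativedemo"); // loads libnativedemo.so
    }

    public static native long sumNative(int[] values);

    public static long sumManyTimes(int[] values, int calls) {
        long total = 0;
        for (int i = 0; i < calls; i++) {
            total += sumNative(values); // pays the JNI crossing every iteration
        }
        return total;
    }
}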

PS - ever since the Snapdragon S4, there's been a separate grid computing element in the processor directly given over to fast UI operations.

I have no idea if it's really being exploited as that's processor specific, something that's usually avoided.

I doubt many people have heard of it.

~~~~~~~

For our curious non-programmers, for an interesting look at some Java vs native differences, check out "CF-Bench"

https://play.google.com/store/apps/details?id=eu.chainfire.cfbench

And for those unfamiliar, note well Chainfire's warning that it's not intended to act like benchmarks you may know; the end score is just a number. ;) :)
 
  • Like
Reactions: alostpacket
Upvote 0
Been meaning to mention this for a while -

I think that one thing that will make or break this is garbage collection (translation: reclaiming previously used memory so it's free again for general use).

Java is very good at it, and improvements there were part of what made Gingerbread great.
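
The classic Android illustration of why that matters: allocate inside a per-frame path and you force frequent collections, each of which can show up as a dropped frame; hoist the allocation and the garbage goes away (a made-up custom View, standard APIs):

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

// Classic example of GC pressure: allocating in a per-frame path forces
// frequent collections; hoisting the allocation out avoids them.
public class ChartView extends View {
    private final Paint paint = new Paint(); // reused every frame, no garbage

    public ChartView(Context context) {
        super(context);
        paint.setColor(Color.BLUE);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // Bad: new Paint() here would create garbage on every frame,
        // and each collection pause can show up as a dropped frame.
        canvas.drawLine(0, 0, getWidth(), getHeight(), paint);
    }
}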


And I was mistaken earlier: JIT compiling was announced at Google I/O 2010 and introduced in Froyo.


I don't know about anyone else but I really miss those official videos for new Android revisions.
 
Upvote 0
