
Underlying tech in the Evo 3D - qHD, 3D, dual-core SMP

The Exynos uses a newer ARM A9 core vs. the A8 in the dual-core Snapdragon. The A9 is more efficient clock for clock than the A8. Qualcomm was banking on the higher clock speed of its A8 to compete with the more efficient A9 chips like the Tegra 2 and Exynos, which were clocked at 1 GHz.

Except the Qualcomm 8660 is not an A8. It's Qualcomm's own custom ARM-compatible design, Scorpion. Now if the A9 & Scorpion are both running at 1.2 GHz, the A9 may still be slightly more powerful, but it's not as clear-cut as A8 vs. A9. And processor power aside, it sounds like the S II has GPS issues along with other QC problems. Who knows, the 3D/Sensation may have their own QC issues, but unfortunately we won't know until they are released (hopefully soon).

Per Anandtech: "The third contender in 2011 is Qualcomm…"
 
Upvote 0
Except the Qualcomm 8660 is not an A8. It's Qualcomm's own custom ARM-compatible design, Scorpion. Now if the A9 & Scorpion are both running at 1.2 GHz, the A9 may still be slightly more powerful, but it's not as clear-cut as A8 vs. A9. And processor power aside, it sounds like the S II has GPS issues along with other QC problems. Who knows, the 3D/Sensation may have their own QC issues, but unfortunately we won't know until they are released (hopefully soon).

Per Anandtech: "The third contender in 2011 is Qualcomm’s Scorpion core. Scorpion is a dual-issue, mostly in-order microprocessor architecture developed entirely by Qualcomm. The Scorpion core implements the same ARMv7-A instruction set as the Cortex A8 and A9, however the CPU is not based on ARM’s Cortex A8 or A9. This is the point many seem to be confused about. Despite high level similarities, the Scorpion core is not Qualcomm’s implementation of a Cortex A8. Qualcomm holds an ARM architecture license which allows it to produce microprocessors that implement an ARM instruction set. This is akin to AMD holding an x86 license that allows it to produce microprocessors that are binary compatible with Intel CPUs. However calling AMD’s Phenom II a version of Intel’s Core i7 would be incorrect. Just like calling Scorpion a Cortex A8 is incorrect."

You're right, my fault.

Still, the scorpion architecture hasn't changed much because of the way Qualcomm licenses the technology. It's also clear that the A9 processors are still going to be more efficient, generally speaking, clock-per-clock, than the 8660.

I still stand by my conclusion that the Exynos is going to be a more powerful application processor than the dual core snapdragon.

However, I think it will be more competitive with something like the Tegra 2.
 
Upvote 0
Still, the scorpion architecture hasn't changed much because of the way Qualcomm licenses the technology.

Those are different things entirely.

Qualcomm is fabless for its SoC line - it contracts others to actually build the chips.

The Snapdragon has seen steady improvement over its lifetime, and the Scorpion architecture is concealed by corporate security, so I don't know how you can say it hasn't changed much. Even the material that revealed the details of the original architecture seems to be missing from the public references I used for it just last year.

However, I think it will be more competitive than something like the Tegra 2.

ftfy ;) :)
 
Upvote 0
Well, if this thing isn't out within a month I am SOL. My phone on big red will be lucky to see 5 more weeks of use before it dies. If the Sensation, Kingdom, or 3D isn't out by this time next month, plus or minus a week, I may have to stick with big red. :mad:
Just get an EVO 4G, buy yourself 30 days, and pick up the 3D when it launches. OR... just get a crappy flip phone activated for the short while you would need it. Don't give up on the EVO 3D over a measly month or two!
 
Upvote 0
I concur. The 3D aspect of the phone is definitely more gimmicky than most of its other features. That aspect provides a wow factor that's easy to see. But the true wow factor for me is the muscle behind it. I can't see myself having the same attitude toward this phone in 22 months as I do toward my 21-month-old Pre now, either.

I just want it already. I don't know of any other Sprint devices coming that appeal to me as much, and I don't know how long I can resist turning my Pre into a human-launched projectile...
That is exactly how I feel about my Pre now. I am just tired of it.
 
Upvote 0
You're right, my fault.

Still, the scorpion architecture hasn't changed much because of the way Qualcomm licenses the technology. It's also clear that the A9 processors are still going to be more efficient, generally speaking, clock-per-clock, than the 8660.

I still stand by my conclusion that the Exynos is going to be a more powerful application processor than the dual core snapdragon.

However, I think it will be more competitive with something like the Tegra 2.

Interesting you say that. I've seen the Exynos head to head against the Tegra 2, and the Tegra edged it out pretty well. Don't underestimate the power of the TEGRAZONE!!!! ;)
 
  • Like
Reactions: EarlyMon
Upvote 0
This is from the SGS2 forum and it's so good I have to share it:

I've already e-mailed the developer regarding this, nothing against the guy, he just needs to update the benchmark.

It's meant to be Smartbench 2011, but it doesn't seem very smart considering other, much more demanding graphics benchmarks like GLBenchmark 2.03 and Electopia clearly show the Mali-400 MP is a lot faster than the SGX540.

My apologies for not responding sooner - I was on vacation, so I was mostly ignoring e-mail. :D

I suppose there is some truth in what you are saying. When I built the GPU tests for Smartbench, I picked sets of OpenGL functions that I would use if I had to build 3D apps/games. OpenGL is a powerful API, so there's always more than one way to implement the same set of scenes/animations. Some will turn out to be more efficient than others. I'm not sure how mine would stand against others, but I did put a lot of effort in - this wasn't done over a few weeks.

One thing I do want to point out, though, is that although the scenes look simpler in Smartbench compared to the likes of GLBenchmark's Egypt, etc., they contain a lot of polygons that are not visible to the naked eye. Each wall you see is split into many smaller polygons, and I've used the maximum number of light sources Android's version of OpenGL can handle. What I'm trying to say is that (mainly due to my own lack of graphical skills) I went the more brute-force way - it doesn't look pretty, but I made sure there were enough objects that it would not get close to 60 fps on any device. Does it sound like a techie trying to create a demanding 3D scene? Yes. :D But I can assure you, the scenes are very taxing on GPUs, more so than they appear. Perhaps the APIs I used aren't as efficient on Mali as other API calls, but if that were the case, then the apps I build also would not execute any faster. Having said that, once I get my hands on the SGSII, I will look further into it. I am rather intrigued by the "MP" part of Mali.

I have experience writing code for server environments, desktops, and now phones, but I'm no graphics designer, and until my little daughter grows up and starts producing nice graphics for the apps I write, I have no choice. :D

I don't mind taking criticism, though - it's usually the best way to find out where my apps need improving, so please do continue to point these things out.

Translation - the graphics benchmark in Smartbench is a stress test.
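
In rough code terms - purely a hypothetical sketch, not Smartbench's actual source, and every class/method name here is made up - a brute-force GLES 1.x scene of the kind he describes comes down to subdividing simple geometry into lots of lit polygons and switching on every fixed-function light the API offers:

Code:
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;

// Hypothetical "brute force" stress renderer: simple-looking walls,
// but each one is split into thousands of lit quads.
public class StressSceneRenderer implements GLSurfaceView.Renderer {
    private static final int WALL_SUBDIVISIONS = 64; // each wall becomes 64x64 quads

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glEnable(GL10.GL_LIGHTING);
        // GLES 1.x fixed-function exposes GL_LIGHT0..GL_LIGHT7 - enable them all.
        for (int i = 0; i < 8; i++) {
            gl.glEnable(GL10.GL_LIGHT0 + i);
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        // Every extra vertex has to be transformed and lit by all eight lights,
        // which is what keeps the frame rate well under 60 fps on real hardware.
        drawSubdividedWalls(gl, WALL_SUBDIVISIONS);
    }

    private void drawSubdividedWalls(GL10 gl, int subdivisions) {
        // Vertex buffer submission omitted - the geometry itself is trivial,
        // the load comes from how finely it is subdivided.
    }
}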

Yes, devs can optimize apps. I'll bet most of you already own games whose performance is all over the place - because while devs can optimize apps, they'll very often simply use the graphics library functions that look right to them as coders using any number of ideas, and expect the graphics API (application program interface) to handle their needs through hardware.

Takeaway - if you want a benchmark that stresses things to the max before spitting out numbers, looks like Smartbench will do nicely.

Yes - this is a turnaround on my position from before - and it isn't: now we can correlate the results to a meaning, and I've always complained about benchmarks putting out uncorrelated numbers.
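
And for what "spitting out numbers" means mechanically - again just a generic sketch, not any particular benchmark's real code - a frame-rate meter is nothing more than counting frames over a fixed wall-clock window and dividing:

Code:
// Generic frame-rate meter of the kind a GPU benchmark boils down to.
public class FpsMeter {
    private long windowStartNanos = System.nanoTime();
    private int framesInWindow = 0;

    // Call once per rendered frame (e.g. from onDrawFrame).
    // Returns the measured FPS once per second, or -1 while still counting.
    public double onFrame() {
        framesInWindow++;
        long elapsed = System.nanoTime() - windowStartNanos;
        if (elapsed >= 1_000_000_000L) {
            double fps = framesInWindow * 1e9 / elapsed;
            framesInWindow = 0;
            windowStartNanos = System.nanoTime();
            return fps;
        }
        return -1;
    }
}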
 
  • Like
Reactions: novox77
Upvote 0
I don't think I asked the question correctly. I meant for things like web browsing and general usage. Does having hardware support for 3D impact how well the screen handles 2D (not just processing power, but sharpness, tonal range and so forth)?

I'm not too sure about the E3D itself, but I work in a TV department and I can say that with 3D TVs, it is actually the 2D specs that make the quality of the 3D.
For example: a 1080p 240 Hz LED TV with 3D versus a 720p 600 Hz plasma TV with 3D. Obviously the LED TV was a lot clearer in both 2D and 3D. If that carries over to the E3D, then I'd say no, it wouldn't.
 
Upvote 0
I thought the gist of GCD was that apps would NOT necessarily have to be rewritten to take advantage (albeit not complete advantage) of multi-core. Isn't that the idea behind it - just a hardware flag in the app or something to that effect? Though I doubt that's full optimization, as it would be for the core services, which are undoubtedly rewritten.

I'm totally familiar with that theory. People have made those claims about software scalability since just after the IBM 704. The claims have been known to come true, fairly often, some would say.

So - maybe I'm wrong - let's test it.

My surveys of the iPad forums tell me that while all iPad apps are there for the iPad2, many users are reporting no change in behavior for many apps with some complaints of sluggishness, same as before. Some blogs report which apps will run better to show off the iPad2.

But - I've probably not looked for the past 2 weeks, maybe even 3.

I could be wrong - but I do believe the Apple forums were busy with folks discussing whether there would be dual-core-only apps.


So - yes, iOS apps will run as is. If there are issues, perhaps a rebuild will do.

GCD is very cool - it has to be.

My takeaway has been that apps benefit from dual-core re-design under iOS, not the same as under Android - if I'm wrong, I'll eat tasty crow, with chipotle bbq sauce, yum yum.

But I don't think I am.
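
To make the re-design point concrete, here's a minimal Java sketch - deliberately not GCD or any Apple API, just the generic idea, with made-up names: the same CPU-bound work run the old single-threaded way, then split across a pool sized to however many cores the runtime reports. Only the second version can ever touch a second core.

Code:
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class DualCoreDemo {
    // Stand-in for real app work: CPU-bound number crunching.
    static long crunch(int from, int to) {
        long sum = 0;
        for (int i = from; i < to; i++) sum += (long) i * i;
        return sum;
    }

    public static void main(String[] args) throws Exception {
        final int n = 50_000_000;

        // 1) The unmodified "single-threaded" app: a second core just sits idle.
        long serial = crunch(0, n);

        // 2) The redesigned app: split the same work across all available cores.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        List<Callable<Long>> chunks = new ArrayList<Callable<Long>>();
        int step = n / cores;
        for (int c = 0; c < cores; c++) {
            final int from = c * step;
            final int to = (c == cores - 1) ? n : from + step;
            chunks.add(() -> crunch(from, to));
        }
        long parallel = 0;
        for (Future<Long> f : pool.invokeAll(chunks)) parallel += f.get();
        pool.shutdown();

        System.out.println(serial == parallel ? "results match" : "bug!");
    }
}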
 
Upvote 0
I'm totally familiar with that theory. People have made those claims about software scalability since just after the IBM 704. The claims have been known to come true, fairly often, some would say.

So - maybe I'm wrong - let's test it.

My surveys of the iPad forums tell me that while all iPad apps are there for the iPad2, many users are reporting no change in behavior for many apps with some complaints of sluggishness, same as before. Some blogs report which apps will run better to show off the iPad2.
That doesn't sound very scientific.. :p

I could be wrong - but I do believe the Apple forums were busy with folks discussing whether there would be dual-core-only apps.
I think they can compile as universal w/ a flag in the header (or whatever you call it in software dev speak). It would be folly for Apple to allow the market to be fragmented in such a way. And you know how close Apple likes to keep their flock of sheep. :D

My takeaway has been that apps benefit from dual-core re-design under iOS
I think to an extent this is true.
not the same as under Android
This actually surprised me. Bravo to Android underpinnings if it is indeed true. And I have no reason to doubt you sir. You probably have at least 5 years before dementia sets in.:eek:
 
Upvote 0
I think they can compile as universal w/ a flag in the header (or whatever you call it in software dev speak). It would be folly for Apple to allow the market to be fragmented in such a way. And you know how close Apple likes to keep their flock of sheep. :D

This actually surprised me. Bravo to Android underpinnings if it is indeed true. And I have no reason to doubt you sir. You probably have at least 5 years before dementia sets in.:eek:

1. Universal compiles lead to more bloated apps in the cases I've seen on computers (so I'd assume the same is true of mobile).

2. Android underpinnings are described accurately.

3. I think I'll enter early dementia if I'm going to be quoted out of context! :p :D

Plus, I still need to check out the new vid from Gary - can't wait!
 
  • Like
Reactions: GaryColeman
Upvote 0
1. True, but moot.

You're right, I apologize - and I apologize because I can see from your response how that sounded: it must sound like I'm finding excuses to throw rocks because bloat wasn't the subject, performance was.

Never my intent to behave that way.

I went off on a tangent, trying to remember what I'd read of GCD earlier - and couldn't, and didn't want to cheat and just google, because that's not discussing what I know - and I just finished a _large_ build yesterday that was processor-portable, and we were discussing our own bloat - and I remembered something in here I wrote about app size - so that slipped out without me thinking about it.

My posts - like this one - are very OCD sometimes, but my thinking on that wasn't - just normal mapping to the solution space, something I do on auto-pilot.

So, I don't have an excuse to have appeared to throw rocks, but there's the explanation how it happened - if it matters.

~~~~~~

My issue that sent me on this tangent with my unfair statement:


  • I know iOS apps don't need re-compiling to run iPad/iPad2.
  • Does Grand Central rely on a compiler switch?
    • It need not. It could be portable code by default and then just act based on sensing the processor (see the sketch after this list).
      • For that to be so, GCD would have to have been dual-core ready for some time. Was it? (Might sound like a loaded question, it's not - it's just a question.)
      • If GCD didn't have that support before, then conceivably all that apps would require would be a rebuild against the new GCD. In such a case, whether hardware sensing or compiler switch becomes a moot point - a rebuild is a rebuild.
  • Ultimately, compiler switch or hardware sensing doesn't matter.
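
As a hedged illustration of that hardware-sensing option - generic Java, no claim about what GCD actually does internally - the runtime just asks how many cores are online and sizes its work queues from that, which is why the compiler-switch question ends up moot:

Code:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical example: one shared pool, sized at startup by asking the hardware.
// The same binary degenerates to a single-threaded queue on a single-core phone
// and spreads across both cores on a dual-core one - no compile-time switch needed.
public final class WorkQueues {
    private WorkQueues() {}

    private static final int CORES = Runtime.getRuntime().availableProcessors();
    public static final ExecutorService POOL = Executors.newFixedThreadPool(CORES);
}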

Because all services are embedded ... you see where I'm headed there? Supportability and sustainability again - before you see the dual-core advantage.

I think I have a solution to the question: we need an expert to chime in, I'm out of my depth on what Grand Central really does - and I don't think it's fair for me to just google, read and come back and spout an answer. Not my style for this level of tech. (I'd read up on Grand Central some time ago, fwiw. Reading != working.)

So - in my mod OCD ways, I hooked up with an iOS game dev who's fishing around to transition to Android and have had nice exchanges with him - he seems ok to me.

I'm going to drop him a line, ask him if he'd be interested in reviewing this stuff and see if he'll post to clarify the situation.

Because he's an expert according to me, anything he says that contradicts my claims will instantly trump what I've said, so we'll have no danger of arguing a priori.

And then, we'll know.

Meanwhile, note to other mods - please leave this in place for just a short while, I'll move it after contacting the iOS/Android dev - let's give him context for the target, then move. TIA for that.

Once it's fully clear whether this heads anywhere, maybe we'll want to spin an Android/iOS dual-core tech thread out of this in the Android Lounge or something like that.
 
  • Like
Reactions: hakujin
Upvote 0
The nature of the Linux operating system (that Android is built on) is to use memory to map to various hardware function interfaces and registers, including but not limited to the GPU, and the Snapdragon is free to allocate however much memory it wants before the operating system ever loads.

What's left over is what the kernel knows about - RAM available to the operating system for its stuff and yours - and that's what it reports.

My year-old Evo shows I have 427.6 MB of RAM, despite an actual memory chip with 512 MB - so 85 MB is set aside right there, according to one program.

Some people feel cheated somehow when they learn that, but no need - it's no more of a cheat for the OS to use memory that way than any other - so it just is what it is.

As I write this, my phone has 136 MB free and seems quite responsive.

By the way, another program insists I only started out with 394 MB (and it breaks the same 136 MB of free memory down as 112 MB really free plus a 24 MB lower limit before the operating system takes over and starts killing and cleaning).
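
If you want to poke at those numbers yourself, here's a rough sketch using nothing but stock Android APIs (the class name is made up): compare what the kernel reports in /proc/meminfo with what ActivityManager hands back. Different tools read different fields, which is why two programs can disagree about the same phone.

Code:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

public class MemoryPeekActivity extends Activity {
    private static final String TAG = "MemoryPeek";

    @Override
    protected void onResume() {
        super.onResume();

        // What the kernel sees: physical RAM minus everything carved out
        // before it boots (GPU, radio, and other hardware set-asides).
        try {
            BufferedReader r = new BufferedReader(new FileReader("/proc/meminfo"));
            String line;
            while ((line = r.readLine()) != null) {
                if (line.startsWith("MemTotal:") || line.startsWith("MemFree:")) {
                    Log.d(TAG, line);
                }
            }
            r.close();
        } catch (IOException e) {
            Log.w(TAG, "couldn't read /proc/meminfo", e);
        }

        // What the framework reports: available memory plus the low-memory
        // threshold at which Android starts killing background processes.
        ActivityManager am = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
        ActivityManager.MemoryInfo info = new ActivityManager.MemoryInfo();
        am.getMemoryInfo(info);
        Log.d(TAG, "availMem=" + (info.availMem / (1024 * 1024)) + " MB"
                + ", killThreshold=" + (info.threshold / (1024 * 1024)) + " MB"
                + ", lowMemory=" + info.lowMemory);
    }
}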
 
  • Like
Reactions: marctronixx
Upvote 0
The nature of the Linux operating system (that Android is built on) is to use memory to map to various hardware function interfaces and registers, including but not limited to the GPU, and the Snapdragon is free to allocate however much memory it wants before the operating system ever loads.

What's left over is what the kernel knows about - RAM available to the operating system for its stuff and yours - and that's what it reports.

My year-old Evo shows I have 427.6 MB of RAM, despite an actual memory chip with 512 MB - so 85 MB is set aside right there, according to one program.

Some people feel cheated somehow when they learn that, but no need - it's no more of a cheat for the OS to use memory that way than any other - so it just is what it is.

As I write this, my phone has 136 MB free and seems quite responsive.

By the way, another program insists I only started out with 394 MB (and it breaks the same 136 MB of free memory down as 112 MB really free plus a 24 MB lower limit before the operating system takes over and starts killing and cleaning).

So in reality they should report what FREE memory will be available with the phones, or at least say how much memory the GPU is going to take automatically. Otherwise, what is the point of, say, 2 GB of memory if 1.75 GB is going to be withheld by the GPU? Would give us a better idea on stuff, no?
 
Upvote 0
Nope.

The price of running the hardware and operating system is the price.

It will vary depending on updates, too. Old PCs used to suck RAM mirroring BIOS or mapping graphics memory, no one's ever spec'd anything differently. This is really nothing new or different.

And you are the one using the memory, ultimately, so why care how the engineers did things, be they Qualcomm, Google, HTC or in other cases, Microsoft?

The lesson isn't to monkey with the spec; the lesson is what you already know - more RAM is always better, regardless of computer size. And these things are like miniature laptops - amazing they do so much with so little.

Once upon a time we had floppy disks and 64 KB of memory. The specs of a PC today, sent back in time, would make it sound like you could own the Library of Congress. But as tech advances, memory use goes up with it.

That's what I mean when I say the cost is the cost.
 
Upvote 0
Nope.

The price of running the hardware and operating system is the price.

It will vary depending on updates, too. Old PCs used to suck RAM mirroring BIOS or mapping graphics memory, no one's ever spec'd anything differently. This is really nothing new or different.

And you are the one using the memory, ultimately, so why care how the engineers did things, be they Qualcomm, Google, HTC or in other cases, Microsoft?

The lesson isn't to monkey with the spec; the lesson is what you already know - more RAM is always better, regardless of computer size. And these things are like miniature laptops - amazing they do so much with so little.

Once upon a time we had floppy disks and 64 KB of memory. The specs of a PC today, sent back in time, would make it sound like you could own the Library of Congress. But as tech advances, memory use goes up with it.

That's what I mean when I say the cost is the cost.

Agreed that more RAM is better. But what I'm saying is, what's the point of having, say, 1 GB of RAM if the GPU is allowed to reserve/hog, say, 90% of that, leaving the user with little left? I know that's unlikely to happen, but it could, and in turn it would affect the ability to multitask.
 
Upvote 0
Agreed that more RAM is better. But what I'm saying is, what's the point of having, say, 1 GB of RAM if the GPU is allowed to reserve/hog, say, 90% of that, leaving the user with little left? I know that's unlikely to happen, but it could, and in turn it would affect the ability to multitask.

I got what you were saying the first time. You're not alone - I've seen others make the same argument.

You're evidently not understanding what I already explained, and you're exaggerating to make a point that doesn't really exist.

How to bridge the gap?

I got your point - but it's based on a cartoon and a post that convinced you that you lost 200 MB somewhere, maybe hogged by the GPU - and that never really happened.

They don't specify free memory because there's no one definition of free. Sense is dynamic as are the other included apps. Free the instant the phone boots up? Free after you've launched a bunch of Sense apps, and configured them differently for your use than the next guy?

They've doubled the ram on the 3vo vs. the Evo, and they've at least doubled the capability of the 3vo vs the Evo, and they've doubled the hardware/os reserved use of that ram to provide that capability. And they've easily doubled the free ram you had left over compared to your Evo.


And that reserved memory isn't just for the GPU, I've mentioned that already. Either way, there's reserved RAM for the architecture - just like the OS needs RAM, just like Sense needs ram.

Ultimately, you're robbed of nothing. They could re-design so there's no reserved ram - you'd have the same free memory left over and it would very probably run as slow as molasses.
 
Upvote 0
Considering how well Android manages RAM, particularly when more is needed, wouldn't it be safe to say that free RAM is wasted RAM?

Depends. Running close to the edge is like driving with a tank nearly empty - you can do it, but I wouldn't try to go fast like that.

Running apps can actually use RAM like they're breathing - dynamically sucking some up, dynamically giving it back. (In fact, improving that process is a cornerstone of Gingerbread.)

So, just off the cuff, I'd say the larger the free pool, the less Android will have to step in and manage things - and managing things means stealing processor time from your apps to manage the layout.

And a larger free pool may inspire some app designers to do things they couldn't before - could happen, dunno.

And if I'm wrong and some does seem wasted - who knows what future requirements will be? What comes in the 18 months after Ice Cream Sandwich? Pineapple Upside-Down Cake? Tiramisu? What's all that going to take?
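
For a small, hypothetical picture of what "Android stepping in" looks like from an app's side (class and cache here are made up): when the free pool gets tight, the system calls back into running apps so they can drop caches before it has to start killing processes outright.

Code:
import java.util.HashMap;
import java.util.Map;

import android.app.Activity;
import android.graphics.Bitmap;
import android.util.Log;

public class GalleryActivity extends Activity {
    // A cache that "breathes": it grows while the user browses and is
    // dropped when the system says memory is getting tight.
    private final Map<String, Bitmap> thumbnailCache = new HashMap<String, Bitmap>();

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        // Android only calls this when the free pool is nearly exhausted -
        // the bigger the free pool, the less often this housekeeping runs.
        Log.d("Gallery", "low memory: dropping " + thumbnailCache.size() + " thumbnails");
        thumbnailCache.clear();
    }
}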
 
Upvote 0
realize many people will be using the dual camera capability of this gadget. the camera on the evo is ram intensive and battery intensive, along with GPS usage, so factor that X 2 for the 3D portion of this new gadget and additional RAM is welcomed.

not to mention one will have other items running alongside this, uploads, background services, etc etc, so there will be times the RAM will get a good workout. :D
 
  • Like
Reactions: EarlyMon
Upvote 0
I got what you were saying the first time. You're not alone - I've seen others make the same argument.

You're evidently not understanding what I already explained, and you're exaggerating to make a point that doesn't really exist.

How to bridge the gap?

I got your point - but it's based on a cartoon and a post that convinced you that you lost 200 MB somewhere, maybe hogged by the GPU - and that never really happened.

They don't specify free memory because there's no one definition of free. Sense is dynamic as are the other included apps. Free the instant the phone boots up? Free after you've launched a bunch of Sense apps, and configured them differently for your use than the next guy?

They've doubled the ram on the 3vo vs. the Evo, and they've at least doubled the capability of the 3vo vs the Evo, and they've doubled the hardware/os reserved use of that ram to provide that capability. And they've easily doubled the free ram you had left over compared to your Evo.


And that reserved memory isn't just for the GPU, I've mentioned that already. Either way, there's reserved RAM for the architecture - just like the OS needs RAM, just like Sense needs ram.

Ultimately, you're robbed of nothing. They could re-design so there's no reserved ram - you'd have the same free memory left over and it would very probably run as slow as molasses.

And I get what you're saying. I just wondered if they kept the same ratios/percentages when upgrading the hardware and the like. I would assume they would, but I had no clue. Because if they didn't, then it could be false marketing to say you have X free when in theory you could end up with a sluggish phone.
I know I'm speaking in extremes; I just wondered if/why we never see numbers on the percentage set aside for certain parts of the hardware, you know?
 
Upvote 0
And I get what you're saying. I just wondered if they kept the same ratios/percentages when upgrading the hardware and the like. I would assume they would, but I had no clue. Because if they didn't, then it could be false marketing to say you have X free when in theory you could end up with a sluggish phone.
I know I'm speaking in extremes; I just wondered if/why we never see numbers on the percentage set aside for certain parts of the hardware, you know?

OK, maybe I only thought I understood what you were saying.

then it could be false marketing to say you have X free when in theory you could end up with a sluggish phone
Are you saying that instead of 1 GB free you only have 800 MB free here?

If so - two things: first, they never spec'd free; second, are you counting free as anything free for Android (including your apps) to use after the hardware set-asides, and coming up with 800 MB?

The set-asides for the hardware are a function of Linux and the bootloader as well as the hardware.

Linux may set some aside at the top (in the case of us believing that display, it's 200 MB) - but it also always sets some aside at the bottom - there are set-asides at the bottom on the order of 24 MB and then another 30 MB (so your phone can't run hog apps until it crashes and you can't get calls, or can't recover from hog apps). In my mind, set-asides are set-asides, so it's all the same.

I think you'll find the top/hardware set-asides will vary by software update and roms (it even varies by the tool used to report it without changing anything else).

Maybe the simple answer is simple - you never see set aside here (or in desktop Linux, OS X, or Windows) because that's just an efficiency.

In cases where a GPU or other hardware is absolutely going to "steal" memory to work, those are fixed numbers per revision.

If you look at older desktop machines using the Intel GMA 950 GPU, it would get an allocation after loading enough of the OS to support whatever resolution your monitor had. (And if you study that GPU, you know its set-aside is going to be 144 MB, minimum.) And if there were a video processor, it would get its own buffer, too.

Here's a quote from Apple's support website discussing the same issue:

Depending on the application or task being accomplished, Mac OS X may make additional main memory available to the graphics processor for texture use beyond the base amount mentioned above. The most common types of applications that request more system memory to be used as graphics memory are 3D and graphics-intense applications. Using an extended desktop or mirrored desktop may also increase the amount of system memory used as graphics memory.
I guess what I'm trying to say is that in the old days, functions were fixed, things could be clearly spec'd.

With GUI operating systems, things got entangled, then with dedicated GPUs, things got more entangled, then with "3D games" (not even getting to real 3D) things got more entangled still - and so forth.

Now - I say this in honesty, not sarcasm - maybe I'm so used to this that I'm drinking the Kool-Aid. Maybe I dispense it.

Programmers and system designers (that would be me, too) go like this: Look! RAM! Ooooh - shiny! Wonder how much I can use to provide this cool feature and impress users? (Yes - we literally think that way (although some try to hide it).)

How we got into this was coming from the same direction, and maybe this is semantics.

I haven't tinkered with either the Sensation or the 3vo - but I know things about operating systems and processor cores. And my gut told me right out that with all of the improvements, I wondered if the Sensation isn't running a bit close to the wind with only 768 MB, so I feel more comfortable with 1 GB.

Marc said it simply and best -

marctronixx said:
realize many people will be using the dual camera capability of this gadget. the camera on the evo is ram intensive and battery intensive, along with GPS usage, so factor that X 2 for the 3D portion of this new gadget and additional RAM is welcomed.

It's not that they don't know the set-asides, or that they're being dishonest about them - I think they're just specifying the minimum required based on bare metal.

The list of set-asides is likely to be large, and will vary based on user configuration (they can't control that), and OS changes on Google's side (they can't control that).

Set-asides are hidden in plain sight - whether it's a report telling you the device has 800 MB of RAM, or Android System saying it uses x% of memory, or process probes saying various shared libraries under the hood are using a, b, or c% of memory - so, in my mind, set-asides are set-asides; they're all the same.

In that picture above, the 400+ MB of "used memory" included other set-asides, lots of them.

Do we know if those set-asides are same, less or more than the initial 200 MB set-asides?

Honest question - do we care?
 
Upvote 0
