
Underlying tech in the Evo 3D - qHD, 3D, dual-core SMP

Discussion in 'Android Devices' started by RichboyJhae, Apr 24, 2011.

  1. MelissaM

    MelissaM Newbie

    Except the Qualcomm 8660 is not A8. It's Qualcomm's own modified ARM architecture, Scorpion. Now, if the A9 & Scorpion are both running at 1.2GHz, the A9 may still be slightly more powerful, but it's not as clear-cut as A8 vs A9. And processor power aside, it sounds like the S II has GPS issues along with other QC problems. Who knows, the 3D/Sensation may have their own QC issues, but unfortunately we won't know until they are released (hopefully soon).

    Per Anandtech: "The third contender in 2011 is Qualcomm…"


  2. Jensen

    Jensen Member

    You're right, my fault.

    Still, the scorpion architecture hasn't changed much because of the way Qualcomm licenses the technology. It's also clear that the A9 processors are still going to be more efficient, generally speaking, clock-per-clock, than the 8660.

    I still stand by my conclusion that the Exynos is going to be a more powerful application processor than the dual core snapdragon.

    However, I think it will be more competitive with something like the Tegra 2.
  3. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    Those are different things entirely.

    Qualcomm is fabless for their SoC line - they license others to build them.

    The Snapdragon has seen steady improvement over its lifetime, and the Scorpion architecture is concealed by corporate security, so I don't know how you can say it hasn't changed much. Even the stuff that revealed the details of the original architecture seems to be missing from the common references I used publicly for it just last year.

    ftfy ;) :)
  4. lordofthereef

    lordofthereef Android Expert

    Just get an EVO 4G, buy yourself 30 days, and pick up the 3D when it launches. OR... just get a crappy flip phone activated for the short while you would need it. Don't give up on the EVO 3D over a measly month or two!
  5. spikey0626

    spikey0626 Lurker

    That is exactly how I feel about my Pre now. I am just tired of it.
  6. RichboyJhae

    RichboyJhae Android Enthusiast
    Thread Starter

    Interesting you say that. I've seen the Exynos head to head against the Tegra 2, and the Tegra edged it out pretty well. Don't underestimate the power of the TEGRAZONE!!!! ;)
    EarlyMon likes this.
  7. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    This is from the SGS2 forum and so excellent, I have to share it:

    Translation - the graphics benchmark in Smartbench is a stress test.

    Yes, devs can optimize apps. I'll bet most of you already own games whose performance is all over the place - because while devs can optimize apps, they'll very often simply use the graphics library functions that look right to them as coders using any number of ideas, and expect the graphics API (application program interface) to handle their needs through hardware.

    Takeaway - if you want a benchmark that stresses things to the max before spitting out numbers, looks like Smartbench will do nicely.

    Yes - this is a turnaround on my position from before - and it isn't: now we can correlate the results to a meaning, and I've always complained about benchmarks putting out uncorrelated numbers.
    novox77 likes this.
  8. novox77

    novox77 Leeeroy Jennnkinnns!

    This is the type of info that should be included with the benchmark app so people can have a better appreciation of what's actually happening behind the scenes.
    EarlyMon likes this.
  9. rD PurpleHaze

    rD PurpleHaze Newbie

    I'm not too sure about the E3D itself, but I work in a TV department and I can say that with 3D TVs, it is actually the specs of the 2D that make the quality of the 3D.
    For example: an LED TV, 1080p 240Hz w/3D, versus a plasma TV, 720p 600Hz w/3D. Obviously the LED TV was a lot clearer in both 2D and 3D. If this were to be carried over to the E3D, then I'd say no, it wouldn't.
  10. toronado455

    toronado455 Well-Known Member

    This is my question also. I'm thinking that will depend on how perfectly they can make that extra layer in the display become invisible when in 2D mode. I guess we will just need to wait and see.
  11. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    I'm totally familiar with that theory. People have made those claims about software scalability since just after the IBM 704. The claims have been known to come true, fairly often, some would say.

    So - maybe I'm wrong - let's test it.

    My surveys of the iPad forums tell me that while all iPad apps are there for the iPad2, many users are reporting no change in behavior for many apps with some complaints of sluggishness, same as before. Some blogs report which apps will run better to show off the iPad2.

    But - I've probably not looked for the past 2 weeks, maybe even 3.

    I could be wrong - but I do believe the Apple forums were busy with folks discussing whether there would be dual-core-only apps.

    So - yes, iOS apps will run as is. If there are issues, perhaps a rebuild will do.

    GCD is very cool - it has to be.

    My takeaway has been that apps benefit from dual-core re-design under iOS, not the same as under Android - if I'm wrong, I'll eat tasty crow, with chipotle bbq sauce, yum yum.

    But I don't think I am.
  12. hakujin

    hakujin Android Enthusiast

    That doesn't sound very scientific.. :p

    I think they can compile as universal w/ a flag in the header (or whatever you call it in software dev speak). It would be folly for Apple to allow the market to be fragmented in such a way. And you know how close Apple likes to keep their flock of sheep. :D

    I think to an extent this is true.
    This actually surprised me. Bravo to Android underpinnings if it is indeed true. And I have no reason to doubt you sir. You probably have at least 5 years before dementia sets in.:eek:
  13. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    1. Universal compiles lead to more bloated apps in the cases I've seen on computers (so I'd assume the same is true of mobile).

    2. Android underpinnings are described accurately.

    3. I think I'll enter early dementia if I'm going to be quoted out of context! :p :D

    Plus, I still need to check out the new vid from Gary - can't wait!
    GaryColeman likes this.
  14. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    You're right, I apologize - and I apologize because I can see from your response how that sounded: it must sound like I'm finding excuses to throw rocks because bloat wasn't the subject, performance was.

    Never my intent to behave that way.

    I went off on a tangent, trying to remember what I'd read of GCD earlier - and couldn't, and didn't want to cheat and just google, because that's not discussing what I know. I had just finished a _large_ build yesterday that was processor-portable, and we were discussing our own bloat - and I remembered something in here I wrote about app size - so that slipped out without me thinking about it.

    My posts - like this one - are very OCD sometimes, but my thinking on that wasn't - just normal mapping to the solution space, something I do on auto-pilot.

    So, I don't have an excuse to have appeared to throw rocks, but there's the explanation how it happened - if it matters.


    My issue that sent me on this tangent with my unfair statement:

    • I know iOS apps don't need re-compiling to run iPad/iPad2.
    • Does Grand Central rely on a compiler switch?
      • It need not. It could be portable code by default and then just act based on sensing the processor.
        • For that to be so, GCD would have to have been dual-core ready for some time. Was it? (Might sound like a loaded question, it's not - it's just a question.)
        • If GCD didn't have that support before, then conceivably all that apps would require would be a rebuild against the new GCD. In such a case, whether hardware sensing or compiler switch becomes a moot point - a rebuild is a rebuild.
    • Ultimately, compiler switch or hardware sensing doesn't matter.
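
    The "sensing the processor" idea in the bullets above can be sketched in a few lines. This is illustrative Python, not GCD itself (GCD is Apple's C-level API); the point is just that nothing is decided at compile time - the same code sizes its worker pool from whatever cores it finds at runtime:

```python
# Runtime core-sensing: the same binary adapts to 1 core or 2.
import os
from concurrent.futures import ThreadPoolExecutor

def run_parallel(tasks):
    """Run zero-argument callables on a pool sized from detected cores."""
    cores = os.cpu_count() or 1  # 1 on an Evo 4G, 2 on a 3VO or iPad 2
    with ThreadPoolExecutor(max_workers=cores) as pool:
        return list(pool.map(lambda task: task(), tasks))

print(run_parallel([lambda: 2 + 2, lambda: 3 * 3]))  # [4, 9]
```

    If GCD works this way, a rebuild wouldn't even be needed; if it keys off a compiler switch instead, a rebuild is a rebuild, as noted above.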

    Because all services are embedded ... you see where I'm headed there? Supportability and sustainability again - before you see the dual-core advantage.

    I think I have a solution to the question: we need an expert to chime in, I'm out of my depth on what Grand Central really does - and I don't think it's fair for me to just google, read and come back and spout an answer. Not my style for this level of tech. (I'd read up on Grand Central some time ago, fwiw. Reading != working.)

    So - in my mod OCD ways, I hooked up with an iOS game dev who's fishing around to transition to Android and have had nice exchanges with him - he seems ok to me.

    I'm going to drop him a line, ask him if he'd be interested in reviewing this stuff and see if he'll post to clarify the situation.

    Because he's an expert according to me, anything he says that contradicts my claims will instantly trump what I've said, so we'll have no danger of arguing a priori.

    And then, we'll know.

    Meanwhile, note to other mods - please leave this in place for just a short while, I'll move it after contacting the iOS/Android dev - let's give him context for the target, then move. TIA for that.

    Once fully clear if this heads anywhere, maybe we want to have an Android/iOS dual-core tech thread out of this in the Android Lounge or something like that.
    hakujin likes this.
  15. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    The nature of the Linux operating system (that Android is built on) is to use memory to map to various hardware function interfaces and registers, including but not limited to the GPU, and the Snapdragon is free to allocate however much memory it wants before the operating system ever loads.

    What's leftover is what the kernel knows about - ram available to operating system for its stuff and yours - and that's what it reports.

    My year-old Evo shows I have 427.6 MB of RAM, despite an actual memory chip with 512 MB - so 85 MB is set aside right there, according to one program.

    Some people feel cheated somehow when they learn that, but no need - it's no more of a cheat for the OS to use memory that way than any other - so it just is what it is.

    As I write this, my phone has 136 MB free and seems quite responsive.

    By the way, another program insists I only started out with 394 MB (and it breaks down the same 136 MB of free memory as 112 MB really free plus a 24 MB lower limit before the operating system takes over and starts killing and cleaning).
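
    That gap between the chip and what the OS reports can be sketched with a few lines of Python (illustrative only - the parser mimics /proc/meminfo's format, and the MemTotal figure is invented to match the numbers above):

```python
# Why "total RAM" reported by the OS is less than the chip size: the
# kernel only reports memory it manages; hardware carve-outs (GPU,
# radio, etc.) are claimed before boot and never show up here.

def parse_meminfo(text):
    """Parse /proc/meminfo-style text into a dict of kB values."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        if rest:
            info[key.strip()] = int(rest.split()[0])  # values are in kB
    return info

# Figures chosen to match the post: a 512 MB chip, ~427.6 MB visible.
sample = "MemTotal: 437862 kB\nMemFree: 139264 kB"
info = parse_meminfo(sample)
carve_out_mb = (512 * 1024 - info["MemTotal"]) / 1024
print(f"Hardware carve-out: ~{carve_out_mb:.1f} MB")  # ~84.4 MB
```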
    marctronixx likes this.
  16. So in reality they should report what free memory will be available with the phones, or at least say what amount of memory the GPU is going to take automatically. Otherwise, what is the point of, say, 2GB of memory if 1.75GB is going to be withheld by the GPU? It would give us a better idea on stuff, no?
  17. EarlyMon

    EarlyMon The PearlyMon
    VIP Member


    The price of running the hardware and operating system is the price.

    It will vary depending on updates, too. Old PCs used to suck RAM mirroring BIOS or mapping graphics memory, no one's ever spec'd anything differently. This is really nothing new or different.

    And you are the one using the memory, ultimately, so why care how the engineers did things, be they Qualcomm, Google, HTC or in other cases, Microsoft?

    Lesson isn't monkey the spec, lesson is what you already know - more RAM is always better, regardless of computer size. And these things are like miniature laptops, amazing they do so much with so little.

    Once upon a time we had floppy disks and 64k of memory. Specs for PC today sent back in time would make it sound like you could own the Library of Congress. But as tech advances, memory use goes up for the tech.

    That's what I mean when I say the cost is the cost.
    Vanquished and BenChase7 like this.
  18. Agreed that the more RAM the better. But what I'm saying is, what's the point of having, say, 1GB of RAM if the GPU allocates/reserves, say, 90% of that, leaving the user with little left? I know that's unlikely to happen, but it could, and in turn it would affect the ability to multitask.
  19. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    I got what you were saying the first time. You're not alone - I've seen others make the same argument.

    You're evidently not understanding what I already explained, and you're exaggerating to make a point that doesn't really exist.

    How to bridge the gap?

    I got your point - but it's based on a cartoon and a post that convinced you that you lost 200 MB somewhere, maybe hogged by the GPU - and that never really happened.

    They don't specify free memory because there's no one definition of free. Sense is dynamic as are the other included apps. Free the instant the phone boots up? Free after you've launched a bunch of Sense apps, and configured them differently for your use than the next guy?

    They've doubled the ram on the 3vo vs. the Evo, and they've at least doubled the capability of the 3vo vs the Evo, and they've doubled the hardware/os reserved use of that ram to provide that capability. And they've easily doubled the free ram you had left over compared to your Evo.

    And that reserved memory isn't just for the GPU, I've mentioned that already. Either way, there's reserved RAM for the architecture - just like the OS needs RAM, just like Sense needs ram.

    Ultimately, you're robbed of nothing. They could re-design so there's no reserved ram - you'd have the same free memory left over and it would very probably run as slow as molasses.
    Vanquished and BenChase7 like this.
  20. BenChase7

    BenChase7 Android Expert

    Considering how well Android manages RAM, particularly when more is needed, wouldn't it be safe to say that free RAM is wasted RAM?
  21. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    Depends. Running close to the edge is like driving with a tank nearly empty - you can do it, but I wouldn't try to go fast like that.

    Running apps can actually use RAM like they're breathing - dynamically sucking some up, dynamically giving it back. (In fact, improving that process is a cornerstone of Gingerbread.)

    So, just off the cuff I'd say the larger free pool, the less Android will have to step in and manage things, and managing things means stealing processor time from your apps to manage the layout.

    And a larger free pool may inspire some app designers to do things they couldn't before - could happen, dunno.

    And if I'm wrong and some does seem wasted - who knows what future requirements will be? What comes in the 18 months after Ice Cream Sandwich? Pineapple Upside Cake? Tiramisu? What's all that going to take?
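
    A toy model of that trade-off (this is not Android's actual low-memory killer - the class, app names, and sizes are invented, with the 136 MB free / 24 MB floor figures borrowed from earlier in the thread):

```python
# Background apps are evicted least-recently-used-first, but only when
# free RAM dips below a floor - so a bigger free pool means the OS has
# to step in (and steal processor time) less often.
from collections import OrderedDict

class BackgroundApps:
    def __init__(self, free_mb, floor_mb):
        self.free_mb = free_mb
        self.floor_mb = floor_mb
        self.apps = OrderedDict()  # app name -> resident MB, oldest first

    def launch(self, name, size_mb):
        self.apps[name] = size_mb
        self.apps.move_to_end(name)  # mark as most recently used
        self.free_mb -= size_mb
        while self.free_mb < self.floor_mb and self.apps:
            _victim, reclaimed = self.apps.popitem(last=False)  # kill LRU
            self.free_mb += reclaimed

mem = BackgroundApps(free_mb=136, floor_mb=24)
mem.launch("browser", 60)  # free: 76
mem.launch("maps", 50)     # free: 26
mem.launch("game", 40)     # free hits -14, browser is killed, free: 46
print(list(mem.apps), mem.free_mb)  # ['maps', 'game'] 46
```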
    marctronixx, Vanquished and BenChase7 like this.
  22. marctronixx


    realize many people will be using the dual camera capability of this gadget. the camera on the evo is ram intensive and battery intensive, along with GPS usage, so factor that X 2 for the 3D portion of this new gadget and additional RAM is welcomed.

    not to mention one will have other items running alongside this, uploads, background services, etc etc, so there will be times the RAM will get a good workout. :D
    EarlyMon likes this.
  23. And I get what you're saying. I just wondered if they kept the same ratios/percentages when upgrading the hardware and the like. I would assume they would, but had no clue. Because if they didn't, then it could be false marketing to say you have X free when in theory you could possibly have a slug of a phone.
    I know I'm speaking in extremes, just wondered why I've never seen numbers on the percentage set aside for certain parts of the hardware, ya know.
  24. ArmageddonX

    ArmageddonX Android Expert

    I vote we start a petition for "Pineapple Upside Cake"... that sounds delicious...
  25. EarlyMon

    EarlyMon The PearlyMon
    VIP Member

    OK, maybe I only thought I understood what you were saying.

    Are you saying that instead of 1 GB free you only have 800 MB free here?

    If so - two things: first, they never spec'd free; second, are you counting free as anything free for Android (including for user apps to use), after hardware, and coming to 800 MB?

    The set-asides for the hardware are a function of Linux and the bootloader as well as the hardware.

    Linux may set some aside at the top (in the case of us believing that display, it's 200 MB) - but it also always sets some aside at the bottom - there are set-asides at the bottom on the order of 24 MB and then another 30 MB (so your phone can't run hog apps until it crashes and you can't get calls, or can't recover from hog apps). In my mind, set-asides are set-asides, so it's all the same.

    I think you'll find the top/hardware set-asides will vary by software update and roms (it even varies by the tool used to report it without changing anything else).

    Maybe the simple answer is simple - you never see set aside here (or in desktop Linux, OS X, or Windows) because that's just an efficiency.

    In cases where a GPU or other hardware is absolutely going to "steal" memory to work, those are fixed numbers per revision.

    If you look at older desktop machines using the Intel GMA 950 GPU, it would get an allocation after loading enough of the OS to support whatever resolution your monitor had. (And if you study that GPU, you know its set-aside is going to be 144 MB, minimum.) And if there were a video processor, it would get its own buffer, too.
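
    The resolution-dependent part of such an allocation is just arithmetic. A back-of-envelope sketch (illustrative - real GPUs like the GMA 950 reserve far more than the bare framebuffer, for textures and other working data):

```python
# Minimum memory for double-buffered screen output at a resolution.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    """MB needed just to hold the on-screen pixels."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080), 1))  # 15.8 - a 1080p monitor
print(round(framebuffer_mb(960, 540), 1))    # 4.0 - a qHD phone screen
```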

    Here's a quote from Apple's support website discussing the same issue:

    I guess what I'm trying to say is that in the old days, functions were fixed, things could be clearly spec'd.

    With GUI operating systems, things got entangled, then with dedicated GPUs, things got more entangled, then with "3D games" (not even getting to real 3D) things got more entangled still - and so forth.

    Now - I say this in honesty, not sarcasm - maybe I'm so used to this, I'm drinking the Kool-Aid. Maybe I dispense it.

    Programmers and system designers (that would be me, too) go like this: Look! RAM! Ooooh - shiny! Wonder how much I can use to provide this cool feature and impress users? (Yes - we literally think that way (although some try to hide it).)

    How we got into this was coming from the same direction, and maybe this is semantics.

    I haven't tinkered with either the Sensation or the 3vo - but I know things about operating systems and processor cores. And my gut told me right out that with all of the improvements, I wondered if the Sensation isn't running a bit close to the wind with only 768 MB, so I feel more comfortable with 1 GB.

    Marc said it simply and best -

    I don't think they're unaware of the set-asides or being dishonest about them - I think they're just specifying the minimum required based on bare metal.

    The list of set-asides is likely to be large, and will vary based on user configuration (they can't control that), and OS changes on Google's side (they can't control that).

    Set-asides are hidden in plain sight - whether it starts by telling you that the device has 800 MB of RAM, or says Android System uses x% of memory, or process probes say various shared libraries under the hood are using a, b, or c% of memory - so, in my mind, set-asides are set-asides; they're all the same.

    In that picture above, the 400+ MB of "used memory" included other set-asides, lots of them.

    Do we know if those set-asides are the same, less, or more than the initial 200 MB set-asides?

    Honest question - do we care?

HTC EVO 3D Forum

The HTC EVO 3D release date was July 2011. Features and Specs include a 4.3" inch screen, 5MP camera, 1GB RAM, Snapdragon S3 processor, and 1730mAh battery.
