
Are newer Android phones capable of running this?

I'm on the Note 3 running this through PPSSPP, and even with the phone's specs:

Samsung Galaxy Note 3 - Wikipedia, the free encyclopedia

GTA Liberty City Stories still manages to be incredibly laggy.

Anybody know what's going on here?

The issue is the game itself. I've upgraded phones and tablets a few times since the GTA games were released on Android, and the performance has been the same regardless.

Nothing you can do about it, sadly. The coding in many Android games is bad enough that faster hardware won't help you much.
 
You've gotten some good answers.

I'm a developer, so let me chime in a bit:

There can be several different causes behind the word "lag" - and several different meanings.

The implication in most of the posts thus far is performance / coding / emulating.

The GTA series is based on an engine which is made for PC/Console platforms, SQUEEZED into a phone.

The graphics API is OpenGL ES, usually version 2, though version 3 devices are in the wild and their numbers are increasing.

This API is quite similar to OpenGL for the desktop, somewhere around late version 2 or 3 - that is, programmable fragment shaders and a simplified method of sending geometry and data to the GPU's RAM. It's so similar that, if the desktop OpenGL code was written FOR this subset, there's very little difference between OpenGL and OpenGL ES of a related version.

That's a big if. OpenGL for the desktop is larger and has more features. If they use those on the desktop, they have to write a separate renderer for ES (x).

The point is that most engines have been designed so that graphics in the application are written NOT to OpenGL, but to a set of classes which abstract the concept of OpenGL - so they can switch to ANY API, including DirectX, by substituting the rendering portion of the code.
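To make that concrete, here's a minimal sketch of the idea - class and function names are hypothetical, not from any particular engine: game code draws through an abstract Renderer interface, and the concrete backend behind it can be swapped.

```cpp
// Sketch of an abstracted renderer (hypothetical names).
// Game code only talks to Renderer; the backend can be swapped
// by substituting the concrete class behind the interface.
#include <memory>

struct Mesh;      // engine-side geometry handle (details omitted)
struct Material;  // engine-side shader/texture handle (details omitted)

class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void beginFrame() = 0;
    virtual void draw(const Mesh& mesh, const Material& material) = 0;
    virtual void endFrame() = 0;
};

class GLES2Renderer : public Renderer {
public:
    void beginFrame() override { /* glClear(...), bind framebuffer, ... */ }
    void draw(const Mesh&, const Material&) override { /* glDrawElements(...), ... */ }
    void endFrame() override { /* eglSwapBuffers(...), ... */ }
};

class D3DRenderer : public Renderer {
public:
    void beginFrame() override { /* clear the render target, ... */ }
    void draw(const Mesh&, const Material&) override { /* DrawIndexed(...), ... */ }
    void endFrame() override { /* Present(...), ... */ }
};

// Pick the backend once; everything else goes through the interface.
std::unique_ptr<Renderer> makeRenderer(bool useGLES) {
    if (useGLES) return std::make_unique<GLES2Renderer>();
    return std::make_unique<D3DRenderer>();
}

int main() {
    auto renderer = makeRenderer(true);
    renderer->beginFrame();
    // ... draw calls would go here ...
    renderer->endFrame();
}
```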

Generally this is done so the switch can be done LIVE. That's ok for desktop or consoles, but it's a little too heavy for mobile. Mobile graphics engines should NOT be written so the switch between one version of the graphics API and another can be done at runtime - as on a menu for the user to choose.

The application should be built with a solid, non-mutating API link - to lighten the load on the CPU. It can still mutate, just not through "virtual functions" - as is the typical method in desktop engines.
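A minimal sketch of what that can look like, with hypothetical backend names: the backend is chosen at build time, so every render call is a direct, inlinable call rather than virtual dispatch.

```cpp
// Sketch of a compile-time bound backend (hypothetical names).
// The backend is fixed when the app is built, so every call below is a
// direct (inlinable) call - no virtual dispatch on the hot path.
struct GLES2Backend {
    void beginFrame() { /* glClear(...), ... */ }
    void drawCall()   { /* glDrawElements(...), ... */ }
    void endFrame()   { /* eglSwapBuffers(...), ... */ }
};

struct GLES3Backend {
    void beginFrame() { /* ... */ }
    void drawCall()   { /* glDrawElementsInstanced(...), ... */ }
    void endFrame()   { /* ... */ }
};

// Chosen once, at build time, e.g. by a compiler flag.
#if defined(USE_GLES3)
using ActiveBackend = GLES3Backend;
#else
using ActiveBackend = GLES2Backend;
#endif

template <typename Backend>
void renderFrame(Backend& gpu) {
    gpu.beginFrame();   // resolved at compile time
    gpu.drawCall();
    gpu.endFrame();
}

int main() {
    ActiveBackend backend;
    renderFrame(backend);
}
```

The engine can still be retargeted - it just happens at build time instead of through a vtable at runtime.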

Yet, this isn't the entire story. They have to mix in an engine for audio, another for I/O (usually substituting touch and accelerometer input for game controller or mouse input, plus whatever related input differences there are), and then physics.

Physics is a big issue. It's intensive, usually runs in its own thread or threads, and sometimes it's done on the GPU - but however it's done, getting a dual core phone to pull it off in a way that compares to a modern console or desktop is tough to do.
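As a rough illustration of the threading part (hypothetical names, not how any particular engine does it), the physics simulation typically ticks at a fixed rate on its own thread while the render loop runs elsewhere:

```cpp
// Sketch of a fixed-timestep physics loop on its own thread
// (hypothetical names; a real engine would also sync state with rendering).
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void stepPhysics(double dt) {
    // integrate bodies, resolve collisions, etc. (omitted)
    (void)dt;
}

void physicsThread() {
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> step{1.0 / 60.0};  // 60 Hz simulation
    auto next = clock::now();
    while (running.load()) {
        stepPhysics(step.count());
        next += std::chrono::duration_cast<clock::duration>(step);
        std::this_thread::sleep_until(next);
    }
}

int main() {
    std::thread physics(physicsThread);
    // ... the render loop would run here on the main thread ...
    running.store(false);
    physics.join();
}
```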

Yet, it's possible. The (now old) Kindle Fire 1st Gen is a typical target that should be reasonably capable, given the limits of the PowerVR SGX 540.

Then...there's something responsible for lag that's so simple it's nearly laughable.

It's actually not necessarily lag due to the engine. They may get the entire engine "right" - working fast, light - evenly distributed between audio, physics, graphics, etc...and then, screw up the input system.

Take the accelerometer input - on a graph it looks like an audio signal, because in a way it basically is one. It's a low frequency measure of motion in 3 dimensions which, over time, looks exactly like the low frequency filtered portion of an audio stream.

It's also jittery. It can give you impulses in the 20 to 100 Hz region, depending on the device. We filter it.

The filter is a typical low pass filter used for audio (the bass control, if you will).

We filter out higher frequencies, so the jitter isn't used as input to the controls.
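That "typical" filter is usually just exponential smoothing with a fixed coefficient, something like this sketch (the names and the coefficient here are made up for illustration):

```cpp
// Sketch of the usual fixed-coefficient low pass filter (hypothetical
// names; the coefficient is just an example value).
#include <cstdio>

struct Vec3 { float x, y, z; };

// Smaller alpha = heavier smoothing = less jitter, but more lag.
constexpr float kAlpha = 0.1f;

Vec3 lowPass(const Vec3& raw, const Vec3& prevFiltered) {
    return { raw.x * kAlpha + prevFiltered.x * (1.0f - kAlpha),
             raw.y * kAlpha + prevFiltered.y * (1.0f - kAlpha),
             raw.z * kAlpha + prevFiltered.z * (1.0f - kAlpha) };
}

int main() {
    Vec3 filtered{0, 0, 0};
    // Two jitter samples, then a quick 1 g move on the x axis.
    Vec3 samples[] = {{0.02f, 0, 0}, {-0.03f, 0, 0}, {1.0f, 0, 0}};
    for (const Vec3& s : samples) {
        filtered = lowPass(s, filtered);
        std::printf("x = %.3f\n", filtered.x);
    }
}
```

The jitter gets squashed nicely, but the 1 g move only comes through at about a tenth of its size on that first step - that delay is the lag.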

The problem is - that filter is STUPID about it.

It causes lag...sometimes HORRIBLE lag.

You lose all high amplitude response. The standard filter - given as an example in every text on game engine development I've ever seen that discusses it, and in the original accelerometer filter example on Apple's iOS site to this day - is the wrong filter. And it's nearly everywhere.

What's required is a self-adjusting filter: one that reduces low amplitude, higher frequency signal (the jitter) but still passes high amplitude impulses (your control in a quick turn, for example) through to the game engine.

The typical "answer" I see is simple: reduce the effect of the simple filter (that is, grab the knob and turn it up, letting more high frequencies through). That reintroduces jitter - it makes the game jumpy and difficult to control with the accelerometer.

A different algorithm is required - one that uses a parametric EQ which adjusts according to the unfiltered input. As far as I can tell I may be the only one using that approach in development, though hopefully I'm wrong about that. If you have a game with a bezier curve you can adjust to control accelerometer response, it's either from my code or someone else is using the right filter in theirs.
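A rough sketch of the general idea - simple amplitude-adaptive smoothing rather than a full parametric EQ; all names and constants here are made up for illustration: small deltas get heavy smoothing, large deltas push the coefficient toward 1 so quick moves pass through.

```cpp
// Sketch of amplitude-adaptive smoothing (not a full parametric EQ;
// all names and constants are made up for illustration).
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Small deltas (jitter) get heavy smoothing; large deltas (real moves)
// push the coefficient toward 1 so they pass through almost untouched.
float adaptiveAlpha(float delta) {
    const float kMinAlpha  = 0.05f;  // heavy smoothing for jitter
    const float kMaxAlpha  = 0.9f;   // near pass-through for big impulses
    const float kThreshold = 0.5f;   // delta (in g) treated as a real move
    float t = std::min(std::fabs(delta) / kThreshold, 1.0f);
    return kMinAlpha + (kMaxAlpha - kMinAlpha) * t;
}

Vec3 adaptiveLowPass(const Vec3& raw, const Vec3& prev) {
    const float ax = adaptiveAlpha(raw.x - prev.x);
    const float ay = adaptiveAlpha(raw.y - prev.y);
    const float az = adaptiveAlpha(raw.z - prev.z);
    return { raw.x * ax + prev.x * (1.0f - ax),
             raw.y * ay + prev.y * (1.0f - ay),
             raw.z * az + prev.z * (1.0f - az) };
}

int main() {
    Vec3 filtered{0, 0, 0};
    // Same samples as the simple filter: two jitter readings, then a 1 g move.
    Vec3 samples[] = {{0.02f, 0, 0}, {-0.03f, 0, 0}, {1.0f, 0, 0}};
    for (const Vec3& s : samples) {
        filtered = adaptiveLowPass(s, filtered);
        std::printf("x = %.3f\n", filtered.x);
    }
}
```

With the same three samples as before, the jitter is still squashed, but the quick move now comes through at roughly 0.9 instead of about 0.1.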

You might jump in to say, "But I'm using a controller...xyz model, etc."...

I know.

Here's the problem - game engines generalize I/O. Your game controller's input might be errantly passed through these wrong filters without need.

Creating lag.

It's because the developer didn't REALLY understand that the mobile-platform I/O they added also had to be bypassed for non-mobile style controllers, even on Android.
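The bypass itself is conceptually trivial - route physical controller axes around the motion filter entirely. A rough sketch, with hypothetical names:

```cpp
// Sketch of routing controller input around the motion filter
// (hypothetical names).
struct Vec3 { float x, y, z; };

enum class InputSource { Accelerometer, GameController };

Vec3 lowPass(const Vec3& raw, const Vec3& prev) {
    const float a = 0.1f;  // fixed-coefficient smoothing, as above
    return { raw.x * a + prev.x * (1.0f - a),
             raw.y * a + prev.y * (1.0f - a),
             raw.z * a + prev.z * (1.0f - a) };
}

Vec3 processInput(InputSource source, const Vec3& raw, const Vec3& prevFiltered) {
    if (source == InputSource::GameController) {
        return raw;  // no sensor jitter to remove; filtering only adds lag
    }
    return lowPass(raw, prevFiltered);  // accelerometer: smooth the jitter
}

int main() {
    Vec3 prev{0, 0, 0};
    Vec3 stick{1.0f, 0, 0};  // a hard push on a physical stick
    Vec3 out = processInput(InputSource::GameController, stick, prev);
    (void)out;  // out.x == 1.0f: passed straight through, no added lag
}
```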

It happens.

They'll get it right eventually.
 