2.0.4 OS-X binary release is x86_64 only #2203

Closed
SDLBugzilla opened this issue Feb 11, 2021 · 0 comments

Comments


SDLBugzilla commented Feb 11, 2021

This bug report was migrated from our old Bugzilla tracker.

Reported in version: 2.0.4
Reported for operating system, platform: Mac OS X (All), x86

Comments on the original bug report:

On 2016-06-23 08:39:26 +0000, Richard Russell wrote:

The OS-X binary release (SDL2-2.0.4.dmg) is x86_64 only rather than a 32/64-bit universal binary. Since SDL2 supports OS 10.6 (Snow Leopard), which runs on 32-bit Macs, and since some applications for later OS versions may need to be built as 32-bit (for example if they contain assembler code) there remains a need for a 32-bit binary for the foreseeable future.
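For context, which CPU slices a shipped Mac binary actually contains can be checked with the stock macOS tools. A minimal sketch, assuming a framework path like the one inside the .dmg (the exact path is an assumption), guarded so it degrades gracefully where the Apple tools are absent:

```shell
# Hypothetical path to the framework binary unpacked from SDL2-2.0.4.dmg.
SDL2_PATH="SDL2.framework/Versions/A/SDL2"

if command -v lipo >/dev/null 2>&1; then
    # On the 2.0.4 .dmg this reports only x86_64; a 32/64-bit
    # universal binary would list both i386 and x86_64.
    lipo -info "$SDL2_PATH"
    file "$SDL2_PATH"
else
    echo "lipo not found: these checks require the macOS developer tools"
fi
```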

On 2016-10-12 15:03:00 +0000, Eric wing wrote:

This is probably going to be unpopular, but I think the current distribution is fine and should not be changed. I’m going to describe the Apple development ecosystem here. Remember that I’m just the messenger when talking about that part. (A lot of this response is consolidation of various related posts on this subject.)

On Apple Development

Apple expects developers to always use the latest stable Xcode and SDK. Period. Developers are expected to build for the latest and do backwards compatibility as appropriate.

This is a fundamental design principle ingrained in Apple’s approach. Their reasoning is they can’t predict the future, but can back-fix. This is employed in all areas of their technology from their compilers, to their frameworks, and to the OS runtime itself. (Apple will detect how your app was compiled and as necessary run different or legacy code paths as needed.)

You are a developer, not a normal user, in Apple’s eyes, so you are expected to run the latest developer tools. That also means having a computer new enough to run them: a machine made within the last ~7 years, updated to the latest (and free) OS version.

I don’t care to hear that you hate this, but it is a simple fact of developing for Apple platforms. Every Apple dev has to deal with it. (And there are Mac cloud rental services if you are unwilling to get a capable Mac.)

And by the way, Apple will reject your app if you do not build against the latest SDK (especially on iOS).

On 32-bit Intel

I don’t think it is worth going back. Apple only had a couple of 32-bit Intel models, in their lowest-end Mac SKUs, when they first switched from PowerPC to Intel in 2006. Not many were sold in the grand scheme of things, and only over a very short period.

That was 10 years ago, and I think it is far past time to move on. In my opinion, the chances of an end user on one of those machines installing brand-new software made today are extremely low. If you want to cite yourself as a counterexample, refer to the above: you are a developer, not an end user.

And incidentally, this was also the transition period when new PowerPC Macs were still in the lineup as Intel was being rolled out. So if the argument is merely about dates, then we should be arguing to bring back PowerPC support too. (Though there are even more hurdles against that.)

From a technical standpoint, the 32-bit Objective-C runtime carries major historical baggage so as not to break the ABI. This has become more and more of a problem as the “modern” Obj-C runtime (64-bit Mac, iOS, AppleTV, Apple Watch) continues to advance and diverge, which means a growing set of features simply don’t work in 32-bit x86. Even a veteran Cocoa dev like myself has forgotten a lot of the minutiae of the compatibility matrix between the two. The world has moved on. Obj-C/Cocoa is already a niche skill, and it’s already a challenge to find devs who can handle both Obj-C/Cocoa and SDL.

Though I’m not advocating we gut the codebase of 32-bit support, I am concerned that the time is coming when we will need to decide in favor of the modern Obj-C runtime, and clinging to the legacy one is holding things back from serving the most users with the best experience possible. (I was actually prototyping a way to interoperate SDL with native Cocoa views for complex GUI scenarios. If I used modern Obj-C features, I could implement most of it with minimal impact on the SDL internals; supporting the legacy 32-bit Obj-C runtime would have required substantial reorganization of those internals. This is still in the prototype phase, so I punted on 32-bit support, and nobody needs to care about my stuff yet. But the larger question is: how many other potential patches have we lost because developers were scared away by the legacy codebase requirements?)

On old OS X versions

It is really hard to test on old OS X versions. New Macs can’t run OS X versions older than the one they shipped with, and it gets increasingly hard to test older versions as they go farther back in time. Every OS X release seems to change or break something, or adds a new feature we must handle (e.g. Retina screens; a running joke was that Apple changed/broke the fullscreen API every release). Backwards compatibility is a feature, not a given, and all features have cost and complexity. Everybody does their best to retain backwards compatibility, but I think it is disingenuous to flaunt compatibility all the way back to OSes and computers made 10 years ago when this is clearly not well tested or widely used anymore.

Apple has made recent OS versions free upgrades, so the majority of users can be on the latest. (And the old Mac developer rule of thumb is that the majority of people who actually download new software are on the latest version or the one before it.) When Apple cuts off computers, it is usually past the 7-year mark; why are we expected to support them in new binaries when Apple doesn’t? Again, I’m not saying we remove the source code (with the caveats about the aforementioned legacy 32-bit runtime), but I am saying we should be realistic about what the common case actually is. If you look at the Steam or Omni Group stats, there are zero 32-bit Macs, and 10.6 doesn’t even have its own category anymore. For non-typical cases, developers can recompile SDL themselves for additional architectures.

64-bit did not fully work on OS X 10.5 (64-bit Cocoa wasn’t finished at the time); 10.6 was the first version where it did. My recommendation is to do what we have now: build 64-bit only and set the backwards-compatibility target to 10.6. I don’t want this version number to be some marketing badge of honor that we flaunt and set in stone; what we support should actually work, and work well. But since we have to set it to something, it might still work, and setting the flag doesn’t cost us anything, 10.6 seems doable for the moment without a lot of trade-offs. (Though if somebody reports a bug on 10.6, what we do should be decided case by case.) Shipping a fat binary with 32-bit support, on the other hand, bloats the binary size and implies that all downstream library authors (i.e. SDL satellites and many other friends) should do the same, which also has implications for how they maintain their build systems and do testing. That’s an unfair expectation and burden nowadays, and most of these authors already have their hands full with iOS (multiple architectures) and maybe AppleTV.
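The recommendation above (64-bit only, 10.6 deployment target) can be sketched as build settings. The flag and setting names (`ARCHS`, `MACOSX_DEPLOYMENT_TARGET`, `-mmacosx-version-min`) are real clang/Xcode settings, but the project path here is an assumption:

```shell
# Recommended settings: 64-bit only, backwards-compatibility target 10.6.
ARCHS="x86_64"
MACOSX_DEPLOYMENT_TARGET="10.6"
export MACOSX_DEPLOYMENT_TARGET

if command -v xcodebuild >/dev/null 2>&1; then
    # Hypothetical project path; adjust to wherever the SDL Xcode project lives.
    xcodebuild -project Xcode/SDL/SDL.xcodeproj \
        ARCHS="$ARCHS" MACOSX_DEPLOYMENT_TARGET="$MACOSX_DEPLOYMENT_TARGET"
else
    echo "xcodebuild not found; the equivalent raw compiler flags would be:"
    echo "  clang -arch x86_64 -mmacosx-version-min=$MACOSX_DEPLOYMENT_TARGET ..."
fi
```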

x86_64h (Somebody brought up x86_64h, ‘h’ is for Haswell?)

As for x86_64h, I was completely unaware that it was an option; I still don’t see it in Xcode. I was always under the assumption that architectures couldn’t be sliced that finely. And my impression is that the only reason to care about newer x86 instruction sets is newer SSE instructions. Historically, the SSE vector instruction flags were something you specified separately in your build, and you had to be careful not to run that code on older processors; they were not a fat slice. But assuming it is an option, I think it has other implications. The way fat binaries have historically worked, all libraries must build for that architecture too, meaning both Apple’s libraries and 3rd-party downstream libraries. I also suspect Haswell implies an OS version bump (10.8?). That means dual OS minimum versions if done as fat binaries; or, if we apply Haswell flags to the x86_64 binary itself, we must bump up the minimum OS version.

Based on the stats, I don’t think that is unreasonable. However, I don’t know how much performance benefit we’d actually get (I don’t expect much, if any; generally you have to write the intrinsics yourself to get any real performance, and compilers still fall down on autovectorization beyond the most trivial cases), nor whether it is worth the political fight (probably not). The fact is that we’re still discussing a 10-year-old 32-bit computer that sold in small numbers and has got to be under 1% of the installed base by now. I think a 10.8 Haswell fight is too hard, nobody really cares about it, and it would again have negative implications for downstream library authors. And again, this kind of specialization can be done by compiling SDL yourself. The time to switch to x86_64h is only after Apple makes it the default.

If we do decide we want x86_64h (doesn’t that require 10.8?), I think we should switch to it entirely rather than ship fat binaries with dual OS minimum versions. For the reasons above, I don’t think SDL should do this yet.
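For illustration only, a hypothetical sketch of the fat-slice approach discussed above: build one generic x86_64 slice and one Haswell-tuned x86_64h slice, then combine them with `lipo`. The source and output file names are made up; only the `-arch` flags and `lipo` invocations are real tooling:

```shell
if command -v lipo >/dev/null 2>&1; then
    # Compile the same source twice, once per architecture slice.
    clang -arch x86_64  -c example.c -o example_x86_64.o
    clang -arch x86_64h -c example.c -o example_x86_64h.o
    # The loader on a Haswell-or-newer CPU picks the x86_64h slice;
    # older 64-bit Intel CPUs fall back to the plain x86_64 slice.
    lipo -create example_x86_64.o example_x86_64h.o -output example_fat.o
    lipo -info example_fat.o
else
    echo "lipo/clang -arch not available: this sketch requires macOS tools"
fi
```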

Running 32-bit on 64-bit systems penalty

Also, the way the Mac handles 32-bit and 64-bit means pulling in separate copies of the OS shared libraries. In this day and age, when everybody has completed their 64-bit migration, if yours is the sole 32-bit app on the system, you waste RAM system-wide and slow things down, such as your launch time. This is a bigger issue for older Macs, which are slower and have less RAM, i.e. Macs from 10 years ago. (And Apple started making their own apps 64-bit in 10.6, so the 64-bit libraries are already loaded.)

On 2016-10-12 20:36:27 +0000, Richard Russell wrote:

(In reply to Eric wing from comment # 1)

May I ask whether there is a technical difficulty in building SDL2 2.0.5 as a 32/64-bit universal binary, or is your motivation purely a 'political' one designed to drive apps such as mine (which contains a large amount of 32-bit assembler code) off the market?

I understand that the SDL version-numbering scheme is major.minor.patch, i.e. the difference between 2.0.3 and 2.0.5 is a 'patch' only. It is my opinion that such a far-reaching change as withdrawing support for 32-bit builds cannot be categorised as a 'patch'.

On 2016-10-12 21:07:40 +0000, Ryan C. Gordon wrote:

May I ask whether there is a technical difficulty in building SDL2 2.0.5 as a 32/64-bit universal binary, or is your motivation purely a 'political' one designed to drive apps such as mine (which contains a large amount of 32-bit assembler code) off the market?

We aren't driving anyone off the market. The upcoming 2.0.5 still builds for 32-bit, whether we ship prebuilt binaries or not.

I understand that the SDL version-numbering scheme is major.minor.patch, i.e. the difference between 2.0.3 and 2.0.5 is a 'patch' only. It is my opinion that such a far-reaching change as withdrawing support for 32-bit builds cannot be categorised as a 'patch'.

This is an unfortunate quirk of our development model: 2.0.3 and 2.0.4 were released months and years apart and added new APIs, etc. We guarantee backwards compatibility within patch revisions, and will break compatibility with 2.1, someday.

At this moment, x86 compat doesn't need the heroic efforts that PowerPC would. Eric isn't wrong that it's not a big market, but it's not unreasonable to support in SDL, at least for now. I'll ask Sam about the builds, but in the worst case, you can definitely build a 32-bit SDL from source code.
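Building a 32-bit (or universal) SDL from source, as suggested here, might look roughly like the following. This is a sketch, not the project's documented procedure: the out-of-tree build directories and the libtool `.libs` output paths are assumptions, while the `-arch`/`-mmacosx-version-min` flags and `lipo -create` are standard macOS tooling:

```shell
# Name for the combined universal library (an assumption for this sketch).
OUT="libSDL2-universal.dylib"

if command -v lipo >/dev/null 2>&1; then
    mkdir -p build32 build64
    # Build each architecture slice separately with autoconf.
    ( cd build32 && ../configure CFLAGS="-arch i386 -mmacosx-version-min=10.6" \
          LDFLAGS="-arch i386" && make )
    ( cd build64 && ../configure CFLAGS="-arch x86_64 -mmacosx-version-min=10.6" \
          LDFLAGS="-arch x86_64" && make )
    # Glue the two single-architecture dylibs into one universal binary.
    lipo -create build32/build/.libs/libSDL2.dylib \
                 build64/build/.libs/libSDL2.dylib -output "$OUT"
else
    echo "lipo not found: this sketch requires the macOS developer tools"
fi
```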

(But as far as Apple's concerned: the writing is on the wall. You should be moving away from your 32-bit asm code no matter what. Eventually Apple will drive you off the market.)

--ryan.

On 2016-10-13 01:48:06 +0000, Sam Lantinga wrote:

This was a bug introduced when upgrading the Xcode project. Fixed for 2.0.5 release.
https://hg.libsdl.org/SDL/rev/8f4b7d58bb32

Thanks!
