
Bug 3165 - define numbers don't match types in Swift
Summary: define numbers don't match types in Swift
Status: RESOLVED FIXED
Alias: None
Product: SDL
Classification: Unclassified
Component: file (show other bugs)
Version: HG 2.1
Hardware: x86 Mac OS X 10.8
Importance: P2 enhancement
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2015-11-06 22:30 UTC by C.W. Betts
Modified: 2016-10-01 20:36 UTC
CC List: 0 users

See Also:


Attachments
Add u to defines (6.64 KB, patch)
2016-02-02 01:59 UTC, C.W. Betts

Description C.W. Betts 2015-11-06 22:30:45 UTC
Swift is very strict about types, so much so that values of different signedness or size must be explicitly cast. Most of the defines are imported as 32-bit signed integers (Int32), while the corresponding struct fields are 32-bit unsigned integers (UInt32). Appending a "u" suffix would make the defines import as 32-bit unsigned integers.
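As a minimal, self-contained sketch of the mismatch (the names here are hypothetical, not the actual SDL symbols): an unsuffixed C literal such as #define FLAG_SOMETHING 0x20 is imported into Swift as Int32, while a Uint32 struct field is imported as UInt32, so every assignment needs an explicit cast:

    // Stand-ins for what the C importer produces; names are illustrative.
    let FLAG_SOMETHING: Int32 = 0x20   // unsuffixed #define imports as Int32

    struct Thing {                     // stand-in for an imported C struct
        var flags: UInt32 = 0          // Uint32 field imports as UInt32
    }

    var thing = Thing()
    // thing.flags = FLAG_SOMETHING    // error: cannot assign value of type 'Int32' to type 'UInt32'
    thing.flags = UInt32(FLAG_SOMETHING)  // explicit cast required at every use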
Comment 1 C.W. Betts 2016-02-02 01:59:11 UTC
Created attachment 2372 [details]
Add u to defines

This patch changes all the define constants to end with a "u", making them import into Swift as UInt32.
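For illustration (again with hypothetical names, not the exact defines the patch touches), a suffixed literal such as #define FLAG_SOMETHING 0x20u imports as UInt32, so the cast disappears:

    struct Thing { var flags: UInt32 = 0 }  // stand-in for an imported C struct
    let FLAG_SOMETHING: UInt32 = 0x20       // "u"-suffixed #define imports as UInt32

    var thing = Thing()
    thing.flags = FLAG_SOMETHING            // compiles without a UInt32(...) cast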
Comment 2 Sam Lantinga 2016-10-01 20:36:06 UTC
Fixed, thanks!
https://hg.libsdl.org/SDL/rev/dc59df175689