Creating a font with optionScale = Just (15, 15) and then querying it with fontScale returns (283467841551, 283467841551). It should return (15, 15).
The binding is probably reading a 32-bit integer as a 64-bit one: the lower 32 bits of 283467841551 are exactly 15 (283467841551 = 0x420000000F), and the upper bits are whatever garbage happened to sit next to it in memory.
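The lower-32-bits claim is easy to check by masking the bogus value (a standalone check, not part of the binding):

```haskell
import Data.Bits ((.&.))

main :: IO ()
main = do
  -- Keep only the low 32 bits of the value fontScale returned.
  print (283467841551 .&. 0xFFFFFFFF :: Integer)  -- prints 15
```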
Encountered on x86_64 architecture.
Interesting. The Harfbuzz reference docs indicate that on the C side these should be machine words, which is why I coded it the way I did.
I'm fixing this issue, but it shakes my confidence in the docs I was coding against...
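For context, on x86_64 (Linux/macOS, where this was encountered) C's int is 32 bits while the machine word (long) is 64 bits, so peeking a scale field at the wrong width picks up 4 extra bytes of neighbouring memory. A quick Storable check illustrates the size mismatch (a sketch of the platform sizes, not of the binding's actual FFI declarations):

```haskell
import Foreign.C.Types (CInt, CLong)
import Foreign.Storable (sizeOf)

main :: IO ()
main = do
  print (sizeOf (undefined :: CInt))   -- 4 bytes: C int is 32-bit here
  print (sizeOf (undefined :: CLong))  -- 8 bytes: C long is the machine word
```

So a field documented (or assumed) as a machine word but actually declared int in the C headers will read back with 4 bytes of junk in its upper half, which matches the observed value.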