os: make FormatInt64() handle LONG_MIN correctly

When compiling with gcc 15.2.0 using -O3 -m64 on Solaris SPARC & x64,
we'd get a test failure of:

Assertion failed: strcmp(logmsg, expected) == 0,
 file ../test/signal-logging.c, line 339, function logging_format

because 'num *= -1' produced a value that was out of the range of the
int64_t it was being stored in: negating INT64_MIN overflows a signed
64-bit integer, which is undefined behavior in C.  (Compiling with -O2
happened to work with the same compiler/configuration/platform, though.)

Signed-off-by: Alan Coopersmith <alan.coopersmith@oracle.com>
(cherry picked from commit 7f68b58865)
(cherry picked from commit 3eac9393d7)

Part-of: <https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests/2146>
Alan Coopersmith 2025-12-19 17:10:43 -08:00 committed by Marge Bot
parent 447fec7d5e
commit 33eee35e0c


@@ -2094,12 +2094,14 @@ xstrtokenize(const char *str, const char *separators)
 void
 FormatInt64(int64_t num, char *string)
 {
+    uint64_t unum = num;
     if (num < 0) {
         string[0] = '-';
-        num *= -1;
+        unum = num * -1;
         string++;
     }
-    FormatUInt64(num, string);
+    FormatUInt64(unum, string);
 }
 /* Format a number into a string in a signal safe manner. The string should be