
Conversation

@samth
Contributor

@samth samth commented Feb 11, 2026

Raise a catchable exception for expt with an integer base and bignum exponent, rather than consuming all memory and aborting. Also add a practical size limit in S_bignum to catch cases where the exponent is a large fixnum (e.g., on 64-bit).

This was found in the random testing for Typed Racket, see racket/typed-racket#1494.

c/alloc.c Outdated

d = size_bignum(n);
/* practical cap on bignum allocation: 2^38 on 64-bit, 2^28 on 32-bit */
if ((uptr)d > (uptr)1 << (ptr_bits > 32 ? 38 : 28))
  S_error("", "out of memory");
Contributor


Can you say more about what this is doing? I don't understand where the constants come from, but it seems like they should be derived in "cmacros.ss". More generally, raising exceptions in the kernel instead of Scheme also creates problems; I think this would be incompatible with #1013, for example.

Contributor Author


The constants are just plausible ones that seemed "big enough".

I will try a different approach that avoids raising exceptions in the kernel.

@mflatt
Contributor

mflatt commented Feb 12, 2026

I see that the latest commit moves the practical-limit check to the Scheme side — thanks! — but I'm skeptical of that number. Let's assume that the magic number 40 is given a name in "cmacros.ss", say maximum-plausible-bignum-bits. If I'm calculating right, 40 corresponds to a bignum that is 1/8 of a TB, and I don't see a clear reason to draw the line there. I agree that it's unlikely that anyone wants a number that large, but I think it might fit in hardware that you can buy now. Raising an error when the number is too big to fit into the representation of bignums is clearly a useful check, and that would correspond to maximum-bignum-length. Maybe I'm missing some reason that 40 is effectively a hard limit already? Otherwise, hardwiring a lower "practical" limit seems troublesome in the long run.

In the context of things like Typed Racket tests, maybe the right thing is to use custodian-limit-memory in Racket so you can pick a limit suitable for a given test suite? The expt operation exported by Racket includes a check against that limit, which may be right or may need to be refined further, but a configurable check in Racket seems more like the right place.

@samth
Contributor Author

samth commented Feb 12, 2026

A few thoughts:

  1. Sorry for not seeing this response before adding more code; I will revise based on it and use maximum-bignum-length.
  2. The original issue involved an abort that the memory limit did not catch; see the error here: https://drdr.racket-lang.org/72141/cs/racket/share/pkgs/typed-racket-test/external/tr-random-testing.rkt A specific example that triggers the abort is (expt 2 (expt -19 11)), which this commit fixes.

More generally, I'm not sure what the philosophy should be here. My thought was that safe Scheme operations should not trigger aborts, even when they try to allocate large amounts of memory. But maybe that's not the goal, and we should be much more permissive, at the cost of aborts, so that the operation succeeds whenever there is enough memory.

