
Conversation

@gzliudan
Collaborator

Proposed changes

The upstream library has removed the assembly-based implementation of keccak. We need to maintain our own library to avoid a performance regression.

Types of changes

What types of changes does your code introduce to the XDC network?
Put an x in the boxes that apply:

  • Bugfix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation Update (if none of the other choices apply)
  • Regular KTLO or other maintenance work, e.g. code style
  • CICD Improvement

Impacted Components

Which parts of the codebase does this PR touch?

Put an x in the boxes that apply:

  • Consensus
  • Account
  • Network
  • Geth
  • Smart Contract
  • External components
  • Not sure (Please specify below)

Checklist

Put an x in the boxes once you have confirmed the actions below (or provide reasons for not doing so):

  • This PR has sufficient test coverage (unit/integration tests) OR I have provided a reason in the PR description for not having test coverage
  • Provided an end-to-end test plan in the PR description on how to manually test it on the devnet/testnet.
  • Tested backwards compatibility.
  • Tested that XDC nodes running this version can co-exist with nodes running the previous version.
  • Relevant documentation has been updated as part of this PR
  • N/A

The upstream library has removed the assembly-based implementation of
keccak. We need to maintain our own library to avoid a performance
regression.

---------

Co-authored-by: lightclient <lightclient@protonmail.com>
Copilot AI review requested due to automatic review settings February 11, 2026 02:53
@coderabbitai

coderabbitai bot commented Feb 11, 2026

Important

Review skipped

Auto reviews are disabled on base/target branches other than the default branch.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Copilot AI left a comment

Pull request overview

This PR vendors an in-repo legacy Keccak implementation (with amd64 assembly support) to avoid the performance regression caused by upstream removal of the assembly Keccak path, and updates the codebase to use it instead of golang.org/x/crypto/sha3.

Changes:

  • Added crypto/keccak package (vendored/modified from x/crypto/sha3) providing NewLegacyKeccak256/512.
  • Replaced sha3.NewLegacyKeccak256() usages across trie, p2p, consensus, and misc packages with keccak.NewLegacyKeccak256() (see the sketch after this list).
  • Updated some tests/utilities to use crypto.Keccak256Hash or the new keccak constructors.
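
A minimal sketch of the call-site change, assuming the repository's Go module path is github.com/XinFinOrg/XDPoSChain (the input string is illustrative only):

package main

import (
	"fmt"

	// Before: "golang.org/x/crypto/sha3"
	"github.com/XinFinOrg/XDPoSChain/crypto/keccak"
)

func main() {
	// Drop-in replacement for sha3.NewLegacyKeccak256().
	h := keccak.NewLegacyKeccak256()
	h.Write([]byte("hello"))
	fmt.Printf("%x\n", h.Sum(nil))
}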

Reviewed changes

Copilot reviewed 37 out of 39 changed files in this pull request and generated 7 comments.

Summary per file:
File Description
trie/trie_test.go Switches sponge hashing in trie commit tests to the vendored keccak implementation.
trie/hasher.go Updates trie hasher pool to instantiate KeccakState via the vendored keccak.
tests/state_test_util.go Updates rlpHash to use vendored keccak (but now drops error handling).
p2p/server_test.go Updates RLPx MAC hashers in tests to vendored keccak.
p2p/rlpx_test.go Updates RLPx MAC hashers in tests to vendored keccak.
p2p/rlpx.go Updates handshake MAC hasher initialization to vendored keccak.
p2p/enr/idscheme.go Updates ENR signing/verification hashing to vendored keccak.
p2p/discv5/net.go Updates discv5 rlpHash to vendored keccak.
internal/ethapi/api_test.go Updates test transaction/receipt list hasher to vendored keccak.
eth/downloader/statesync.go Updates state sync keccak hasher initialization to vendored keccak.
crypto/keccak/sha3_test.go Adds KATs and benchmark tests for vendored keccak implementation.
crypto/keccak/sha3.go Adds core sponge/state logic for vendored keccak implementation.
crypto/keccak/keccakf_amd64.go Adds amd64 build stub for assembly permutation implementation.
crypto/keccak/keccakf.go Adds non-asm permutation implementation (pure Go / non-amd64 paths).
crypto/keccak/hashes.go Adds constructors for legacy Keccak-256/512 (but doc references missing APIs).
crypto/keccak/README.md Documents why keccak is vendored and why stdlib crypto/sha3 can’t be used.
crypto/keccak/LICENSE Adds upstream license for vendored keccak implementation.
crypto/keccak.go Routes crypto.NewKeccakState and hashing helpers to vendored keccak.
core/types/order_signing.go Updates order tx hashing to vendored keccak.
core/types/lending_signing.go Updates lending tx hashing to vendored keccak.
core/types/block_test.go Updates test hasher to vendored keccak.
core/txpool/legacypool/lending_pool_test.go Updates lending pool test hashing to vendored keccak.
core/state_processor_test.go Updates bad block generation hashing to vendored keccak.
core/rawdb/accessors_trie.go Updates trie node hasher pool to vendored keccak.
core/rawdb/accessors_indexes_test.go Updates test hasher to vendored keccak.
core/rawdb/accessors_chain_test.go Updates hashing in chain accessor tests to crypto.Keccak256Hash/vendored keccak.
consensus/ethash/ethash.go Updates ethash seed hasher to vendored keccak.
consensus/clique/clique.go Updates clique header signature hashing to vendored keccak.
consensus/XDPoS/engines/engine_v2/utils.go Updates XDPoS v2 signature hashing to vendored keccak.
consensus/XDPoS/engines/engine_v1/utils.go Updates XDPoS v1 signature hashing to vendored keccak.
common/types.go Updates address checksum hashing to vendored keccak.
bmt/bmt_test.go Updates BMT tests/benchmarks to use vendored keccak constructor.
accounts/accounts.go Updates EIP-191 text hash implementation to vendored keccak.
XDCxlending/lendingstate/trade.go Updates lending trade hash computation to vendored keccak.
XDCxlending/lendingstate/lendingitem_test.go Updates lending item test hashing to vendored keccak.
XDCxlending/lendingstate/lendingitem.go Updates lending item hash computation to vendored keccak.
XDCx/tradingstate/trade.go Updates trade hash computation to vendored keccak.


Comment on lines +160 to +165
// Sum applies padding to the hash state and then squeezes out the desired
// number of output bytes. It panics if any output has already been read.
func (d *state) Sum(in []byte) []byte {
if d.state != spongeAbsorbing {
panic("sha3: Sum after Read")
}

Copilot AI Feb 11, 2026

Similar to the Write panic above, this panic string still uses the sha3: prefix even though it's in the keccak package. Consider aligning the prefix for consistency and easier debugging.

Comment on lines +208 to +212
func (d *state) UnmarshalBinary(b []byte) error {
if len(b) != marshaledSize {
return errors.New("sha3: invalid hash state")
}

Copilot AI Feb 11, 2026

All returned errors in this file use the sha3: prefix (e.g., invalid hash state/identifier/function) even though callers will import crypto/keccak. Consider updating these error strings to use a keccak: prefix (or a neutral one) for clearer diagnostics.
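
A hedged sketch of that rename for UnmarshalBinary, mirroring the quoted snippet; the trailing return stands in for the unchanged remainder of the function, and state/marshaledSize are the package-internal names from the quote:

func (d *state) UnmarshalBinary(b []byte) error {
	if len(b) != marshaledSize {
		return errors.New("keccak: invalid hash state") // was "sha3: invalid hash state"
	}
	// ... rest of the unmarshalling logic unchanged ...
	return nil
}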

Comment on lines +349 to 351
hw := keccak.NewLegacyKeccak256()
rlp.Encode(hw, x)
hw.Sum(h[:0])

Copilot AI Feb 11, 2026

rlp.Encode returns an error; the previous implementation panicked on encode failure, but the new code ignores the error entirely. This can mask unsupported types/encoding failures and produce an incorrect hash; please handle the returned error (e.g., panic or t.Fatalf-style failure depending on call site expectations).
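
One possible way to restore the previous behaviour, shown as a sketch; the rlpHash signature and import paths are assumed from the quoted call site, not taken from the PR:

import (
	"github.com/XinFinOrg/XDPoSChain/common"
	"github.com/XinFinOrg/XDPoSChain/crypto/keccak"
	"github.com/XinFinOrg/XDPoSChain/rlp"
)

func rlpHash(x interface{}) (h common.Hash) {
	hw := keccak.NewLegacyKeccak256()
	if err := rlp.Encode(hw, x); err != nil {
		// Panicking preserves the old behaviour and surfaces unsupported
		// types instead of silently hashing incomplete input.
		panic(err)
	}
	hw.Sum(h[:0])
	return h
}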

Comment on lines +68 to +70
t.Errorf("error opening %s: %s", katFilename, err)
}
file := flate.NewReader(deflated)

Copilot AI Feb 11, 2026

If os.Open fails, the test records an error but continues, which will likely lead to a panic when creating the flate reader/decoder. Consider switching to t.Fatalf (or returning) on open failure, and defer closing both the opened file and the flate reader to avoid leaking file descriptors during test runs.

Suggested change
-		t.Errorf("error opening %s: %s", katFilename, err)
-	}
-	file := flate.NewReader(deflated)
+		t.Fatalf("error opening %s: %s", katFilename, err)
+	}
+	defer deflated.Close()
+	file := flate.NewReader(deflated)
+	defer file.Close()

Comment on lines +68 to +75
t.Errorf("error opening %s: %s", katFilename, err)
}
file := flate.NewReader(deflated)
dec := json.NewDecoder(file)
var katSet KeccakKats
err = dec.Decode(&katSet)
if err != nil {
t.Errorf("error decoding KATs: %s", err)

Copilot AI Feb 11, 2026

On JSON decode failure, the test logs an error but continues executing with a potentially empty/partial katSet, which can cause confusing follow-on failures. Consider failing the test immediately when decoding fails (and also closing the flate reader).

Suggested change
-		t.Errorf("error opening %s: %s", katFilename, err)
-	}
-	file := flate.NewReader(deflated)
-	dec := json.NewDecoder(file)
-	var katSet KeccakKats
-	err = dec.Decode(&katSet)
-	if err != nil {
-		t.Errorf("error decoding KATs: %s", err)
+		t.Fatalf("error opening %s: %s", katFilename, err)
+	}
+	defer deflated.Close()
+	file := flate.NewReader(deflated)
+	defer file.Close()
+	dec := json.NewDecoder(file)
+	var katSet KeccakKats
+	if err := dec.Decode(&katSet); err != nil {
+		t.Fatalf("error decoding KATs: %s", err)

// output has already been read.
func (d *state) Write(p []byte) (n int, err error) {
if d.state != spongeAbsorbing {
panic("sha3: Write after Read")

Copilot AI Feb 11, 2026

This package is crypto/keccak, but the panic/error strings still use the sha3: prefix (e.g., "sha3: Write after Read"). This can be misleading when debugging; consider updating these strings to reference keccak (or a neutral prefix) consistently.

Suggested change
-	panic("sha3: Write after Read")
+	panic("keccak: Write after Read")

Comment on lines +30 to +43
// NewLegacyKeccak256 creates a new Keccak-256 hash.
//
// Only use this function if you require compatibility with an existing cryptosystem
// that uses non-standard padding. All other users should use New256 instead.
func NewLegacyKeccak256() hash.Hash {
return &state{rate: rateK512, outputLen: 32, dsbyte: dsbyteKeccak}
}

// NewLegacyKeccak512 creates a new Keccak-512 hash.
//
// Only use this function if you require compatibility with an existing cryptosystem
// that uses non-standard padding. All other users should use New512 instead.
func NewLegacyKeccak512() hash.Hash {
return &state{rate: rateK1024, outputLen: 64, dsbyte: dsbyteKeccak}

Copilot AI Feb 11, 2026

The doc comment refers to New256/New512, but those constructors don't exist in this vendored package. This is misleading for callers; consider adjusting the comment to avoid referencing non-existent APIs (or add the missing constructors if intended).
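
One possible rewording that drops the reference to the missing constructors (the comment text below is a suggestion, not taken from the PR):

// NewLegacyKeccak256 creates a new Keccak-256 hash.
//
// This constructor uses the legacy, non-standard Keccak padding required for
// Ethereum compatibility. For standard SHA3-256, use the x/crypto/sha3 or
// standard-library crypto/sha3 package instead.
func NewLegacyKeccak256() hash.Hash {
	return &state{rate: rateK512, outputLen: 32, dsbyte: dsbyteKeccak}
}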
