49 Commits

Author SHA1 Message Date
524db5af01 Merge pull request #105 from hl-archive-node/feat/cache-spot-meta
feat: cache spot metadata in database to reduce API calls
2025-11-05 02:56:06 -05:00
98cc4ce30b feat: cache spot metadata in database to reduce API calls
Implements persistent caching of ERC20 contract address to spot token ID
mappings in the database to minimize API requests and improve performance.

Changes:
- Add SpotMetadata database table for persistent storage
- Implement load_spot_metadata_cache() to initialize cache on startup
- Add init_spot_metadata() for init-state command to pre-populate cache
- Extract store_spot_metadata() helper to DRY serialization logic
- Enable on-demand API fetches with automatic database persistence
- Integrate cache loading in main node startup flow

The cache falls back to on-demand API fetches if the database is empty,
and automatically persists the fetched data for future use.
2025-11-05 07:48:42 +00:00
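To illustrate the read-through pattern this commit describes, here is a minimal, self-contained sketch. The names `load_spot_metadata_cache` and `store_spot_metadata` come from the commit message; the `Db` type and `fetch_spot_id_from_api` helper are invented stand-ins, not the node's actual types.

```rust
use std::collections::HashMap;

/// Hypothetical persistent store for ERC20 address -> spot token id mappings.
struct Db {
    spot_metadata: HashMap<String, u32>, // stands in for the SpotMetadata table
}

impl Db {
    /// On startup, seed the in-memory cache from the database.
    fn load_spot_metadata_cache(&self) -> HashMap<String, u32> {
        self.spot_metadata.clone()
    }

    /// Persist a freshly fetched mapping so later runs skip the API call.
    fn store_spot_metadata(&mut self, address: &str, token_id: u32) {
        self.spot_metadata.insert(address.to_string(), token_id);
    }
}

/// Illustrative stand-in for the on-demand API fetch.
fn fetch_spot_id_from_api(_address: &str) -> Option<u32> {
    Some(42)
}

/// Read-through lookup: cache first, then API, with write-back persistence.
fn spot_token_id(cache: &mut HashMap<String, u32>, db: &mut Db, address: &str) -> Option<u32> {
    if let Some(id) = cache.get(address) {
        return Some(*id);
    }
    let id = fetch_spot_id_from_api(address)?;
    cache.insert(address.to_string(), id);
    db.store_spot_metadata(address, id);
    Some(id)
}

fn main() {
    let mut db = Db { spot_metadata: HashMap::new() };
    let mut cache = db.load_spot_metadata_cache();
    // First lookup misses the cache, hits the "API", and is persisted.
    assert_eq!(spot_token_id(&mut cache, &mut db, "0xToken"), Some(42));
    // Second lookup is served from the in-memory cache.
    assert_eq!(spot_token_id(&mut cache, &mut db, "0xToken"), Some(42));
    println!("cached entries: {}", cache.len());
}
```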
95c653cf14 chore: fmt 2025-11-05 07:46:22 +00:00
cb73aa7dd4 Merge pull request #104 from hl-archive-node/chore/discovery-local-only
feat: Default to localhost-only network, add --allow-network-overrides
2025-11-05 02:44:54 -05:00
2a118cdacd feat: Default to localhost-only network, add --allow-network-overrides
Network now defaults to localhost-only (local discovery/listener, no DNS/NAT).
Use the --allow-network-overrides flag to restore CLI-based network configuration.
2025-11-05 07:38:24 +00:00
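A rough sketch of the opt-in override behavior described above; the `NetworkSettings` struct and its fields are invented for illustration, with a plain boolean standing in for the --allow-network-overrides flag.

```rust
use std::net::{IpAddr, Ipv4Addr};

/// Illustrative network settings; the real node uses reth's own config types.
#[derive(Debug)]
struct NetworkSettings {
    listener_addr: IpAddr,
    discovery_addr: IpAddr,
    enable_dns_discovery: bool,
    enable_nat: bool,
}

/// Default posture: local discovery/listener only, no DNS discovery, no NAT.
fn localhost_only() -> NetworkSettings {
    NetworkSettings {
        listener_addr: Ipv4Addr::LOCALHOST.into(),
        discovery_addr: Ipv4Addr::LOCALHOST.into(),
        enable_dns_discovery: false,
        enable_nat: false,
    }
}

/// Unless the user explicitly opts in, CLI-provided network settings are ignored.
fn effective_settings(allow_network_overrides: bool, cli: NetworkSettings) -> NetworkSettings {
    if allow_network_overrides {
        cli
    } else {
        localhost_only()
    }
}

fn main() {
    let from_cli = NetworkSettings {
        listener_addr: Ipv4Addr::UNSPECIFIED.into(),
        discovery_addr: Ipv4Addr::UNSPECIFIED.into(),
        enable_dns_discovery: true,
        enable_nat: true,
    };
    // Default: overrides are not allowed, so the node stays localhost-only.
    println!("{:?}", effective_settings(false, from_cli));
}
```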
ff272fcfb3 Merge pull request #103 from hl-archive-node/chore/tx-spec
fix: Adjust transaction parser
2025-11-05 02:08:52 -05:00
387acd1024 fix: Adjust transaction parser based on observations of all blocks in both networks
Tracked by #97
2025-11-05 07:00:41 +00:00
010d056aad Merge pull request #102 from hl-archive-node/fix/testnet-txs-tracking
fix: Fix testnet transaction types
2025-11-04 12:24:04 -05:00
821c63494e fix: Fix testnet transaction types 2025-11-04 17:23:31 +00:00
f915aba568 Merge pull request #100 from hl-archive-node/feat/deprecate-migrator
feat: Place migrator behind `CHECK_DB_MIGRATION` env
2025-11-01 06:23:01 -04:00
1fe03bfc41 feat: Place migrator behind CHECK_DB_MIGRATION env 2025-11-01 09:36:30 +00:00
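A minimal sketch of gating the migrator behind the environment variable; only the CHECK_DB_MIGRATION name comes from the commit, and `run_migration` is a placeholder for the actual (experimental) migration step.

```rust
fn run_migration() {
    // Placeholder for the experimental database migration.
    println!("running db migration");
}

fn main() {
    // The migrator only runs when explicitly requested via the environment.
    if std::env::var_os("CHECK_DB_MIGRATION").is_some() {
        run_migration();
    } else {
        println!("skipping db migration (CHECK_DB_MIGRATION not set)");
    }
}
```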
893822e5b0 Merge pull request #98 from hl-archive-node/fix/testnet-system-tx 2025-10-26 03:39:06 -04:00
c2528ce223 fix: Support certain types of system tx 2025-10-26 06:42:14 +00:00
d46e808b8d Merge pull request #94 from hl-archive-node/fix/migrator-typo
fix(migrate): Fix wrong chunk ranges
2025-10-16 10:42:59 -04:00
497353fd2f fix(migrate): Fix wrong chunk ranges 2025-10-16 14:35:04 +00:00
eee6eeb2fc Merge pull request #93 from hl-archive-node/fix/subscriptions
fix: Prevent #89 from overriding --hl-node-compliant subscriptions
2025-10-13 01:27:19 -04:00
611e6867bf fix: Do not override --hl-node-compliant for subscription 2025-10-13 02:57:25 +00:00
6c3ed63c3c fix: Override NewHeads only 2025-10-13 02:57:05 +00:00
51924e9671 Merge pull request #91 from hl-archive-node/fix/debug-cutoff
fix: Fix --debug-cutoff-height semantics
2025-10-11 22:29:45 -04:00
8f15aa311f fix: Fix --debug-cutoff-height semantics
NOTE: This is a debug feature not on by default.

The original intent was to limit the highest block number, but it was instead being enforced as the starting block number for fetching, leading to unintended block progression.
2025-10-12 02:22:55 +00:00
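A toy sketch of the corrected semantics, where the cutoff bounds the highest fetched block rather than being used as the starting block; the function and parameter names are illustrative only.

```rust
/// Stop fetching once the cutoff is exceeded, instead of starting from it.
fn next_block_to_fetch(current: u64, debug_cutoff_height: Option<u64>) -> Option<u64> {
    match debug_cutoff_height {
        Some(cutoff) if current > cutoff => None, // cutoff is an upper bound
        _ => Some(current),
    }
}

fn main() {
    assert_eq!(next_block_to_fetch(10, Some(100)), Some(10));
    assert_eq!(next_block_to_fetch(101, Some(100)), None);
    assert_eq!(next_block_to_fetch(101, None), Some(101));
    println!("cutoff acts as an upper bound");
}
```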
bc66716a41 Merge pull request #89 from hl-archive-node/cleanup
fix: Convert header type for eth_subscribe
2025-10-10 23:09:33 -04:00
fc819dbba2 test: Add regression tests 2025-10-11 02:52:09 +00:00
1c5a22a814 fix: Convert header type for eth_subscribe
Due to the custom header usage, only the `eth_subscribe` method was returning the new header format raw, while other parts were using RpcConvert to convert headers.

Make `eth_subscribe` newHeads return the `inner` field (the original eth header) instead.
2025-10-11 02:49:19 +00:00
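A simplified sketch of the idea: the custom HlHeader wraps the original eth header in an `inner` field, and the newHeads payload should be that inner header. The struct fields shown here are stand-ins, not reth's actual types.

```rust
/// Stand-in for the original eth header type.
#[derive(Clone, Debug)]
struct EthHeader {
    number: u64,
}

/// Stand-in for the custom header, which wraps the original one.
#[derive(Clone, Debug)]
struct HlHeader {
    inner: EthHeader,
    // extra chain-specific fields would live here
}

/// What newHeads should emit: the inner eth header, not the raw custom type.
fn new_heads_payload(header: &HlHeader) -> EthHeader {
    header.inner.clone()
}

fn main() {
    let h = HlHeader { inner: EthHeader { number: 7 } };
    println!("{:?}", new_heads_payload(&h));
}
```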
852e186b1a Merge pull request #88 from hl-archive-node/hotfix
hotfix: Mark migrator experimental
2025-10-09 04:55:53 -04:00
f83326059f chore: clippy 2025-10-09 08:55:40 +00:00
ca8c374116 feat: Mark migrator as experimental 2025-10-09 08:49:29 +00:00
5ba12a4850 perf: adjust chunk size, do not hold tx too long 2025-10-09 08:20:22 +00:00
8a179a6d9e perf: Use smaller chunks 2025-10-09 08:13:53 +00:00
d570cf3e8d fix: Create directory before migration 2025-10-09 08:13:45 +00:00
0e49e65068 Merge pull request #86 from hl-archive-node/breaking/hl-header
feat(breaking): Use custom header format (HlHeader)
2025-10-09 02:51:09 -04:00
13b63ff136 feat: add migrator for mdbx as well 2025-10-09 06:35:56 +00:00
233026871f perf: chunkify block ranges 2025-10-08 13:54:16 +00:00
7e169d409d chore: Change branch to v1.8.2-fork-hl-header 2025-10-08 13:04:11 +00:00
47aaad6ed9 feat: add migrator 2025-10-08 13:03:51 +00:00
9f73b1ede0 refactor: Move BlockBody from transaction to body 2025-10-06 06:43:17 +00:00
bcdf4d933d feat(breaking): Use HlHeader for HlPrimitives 2025-10-06 06:21:08 +00:00
2390ed864a feat(breaking): Use HlHeader for storing header 2025-10-06 06:21:08 +00:00
567d6ce2e4 feat: Introduce HlHeader 2025-10-06 06:21:08 +00:00
8b2c3a4a34 refactor: Move primitives into files 2025-10-06 06:21:08 +00:00
92759f04db Merge pull request #84 from hl-archive-node/fix/no-panic
fix: Fix panic when block receipts are called on non-existing blocks
2025-10-05 19:47:22 -04:00
71bb70bca6 fix: Fix panic when block receipts are called on non-existing blocks 2025-10-05 14:54:55 +00:00
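A schematic of the non-panicking lookup, with invented stand-in types: unknown blocks yield None instead of panicking.

```rust
use std::collections::HashMap;

#[derive(Debug)]
struct Receipt;

/// Return None for unknown blocks instead of unwrapping and panicking.
fn block_receipts(store: &HashMap<u64, Vec<Receipt>>, number: u64) -> Option<&Vec<Receipt>> {
    store.get(&number)
}

fn main() {
    let store: HashMap<u64, Vec<Receipt>> = HashMap::new();
    // Previously this case could panic; now it simply reports "no such block".
    assert!(block_receipts(&store, 123).is_none());
    println!("non-existing block handled gracefully");
}
```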
5327ebc97a Merge pull request #82 from hl-archive-node/fix/local-reader
fix(local-ingest-dir): Use more robust resumption for hl-node line reader, fix block number increment for reading files
2025-10-05 07:36:32 -04:00
4d83b687d4 feat: Add metrics for file read triggered
Usually, "Loading block data from ..." shouldn't appear in the logs at all. Add metrics to detect when a file read is triggered.
2025-10-05 11:28:11 +00:00
12f366573e fix: Do not increase block counter when no block is read
This caused the ingest loop to increase the block number indefinitely.
2025-10-05 11:28:11 +00:00
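A tiny sketch of the loop fix: the block counter advances only when a block was actually read. `read_block` is an invented stand-in for the local hl-node file reader.

```rust
/// Stand-in reader: Some(bytes) if a block exists at `height`, None otherwise.
fn read_block(height: u64) -> Option<Vec<u8>> {
    if height < 3 {
        Some(vec![height as u8])
    } else {
        None
    }
}

fn main() {
    let mut next_height = 0u64;
    loop {
        match read_block(next_height) {
            Some(block) => {
                println!("ingested block {} ({} bytes)", next_height, block.len());
                next_height += 1; // advance only after a successful read
            }
            None => {
                // Previously the counter advanced here as well, so the loop
                // spun past heights that had not been written yet; instead,
                // stop (or wait and retry) without touching the counter.
                break;
            }
        }
    }
}
```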
b8bae7cde9 fix: Utilize LruMap better
LruMap was introduced to allow getting the same block twice, so removing an item when the block is fetched doesn't make sense.
2025-10-05 11:28:11 +00:00
0fd4b7943f refactor: Use offsets instead of lines, wrap related structs in one 2025-10-05 11:28:04 +00:00
bfd61094ee chore: cargo fmt 2025-10-05 09:58:13 +00:00
3b33b0a526 Merge pull request #81 from hl-archive-node/fix/typo-local
fix: Fix typo in --local (default hl-node dir)
2025-10-05 05:54:35 -04:00
de7b524f0b fix: Fix typo in --local (default hl-node dir) 2025-10-05 04:39:09 -04:00
40 changed files with 1924 additions and 751 deletions

Cargo.lock (generated): 218 changed lines

Every [[package]] entry in the visible portion of the diff is a crate pinned to the hl-archive-node/reth fork at version 1.8.2, and each receives the same one-line change to its source field:

-source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88"
+source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"

Entries changed, in order: reth, reth-basic-payload-builder, reth-chain-state, reth-chainspec, reth-cli, reth-cli-commands, reth-cli-runner, reth-cli-util, reth-codecs, reth-codecs-derive, reth-config, reth-consensus, reth-consensus-common, reth-consensus-debug-client, reth-db, reth-db-api, reth-db-common, reth-db-models, reth-discv4, reth-discv5, reth-dns-discovery, reth-downloaders, reth-ecies, reth-engine-local, reth-engine-primitives, reth-engine-service, reth-engine-tree, reth-engine-util, reth-era, reth-era-downloader, reth-era-utils, reth-errors, reth-eth-wire, reth-eth-wire-types, reth-ethereum-cli, reth-ethereum-consensus, reth-ethereum-engine-primitives, reth-ethereum-forks, reth-ethereum-payload-builder, reth-ethereum-primitives, reth-etl, reth-evm, reth-evm-ethereum, reth-execution-errors, reth-execution-types, reth-exex, reth-exex-types, reth-fs-util, reth-invalid-block-hooks, reth-ipc, reth-libmdbx, reth-mdbx-sys, reth-metrics, reth-net-banlist, reth-net-nat, reth-network, reth-network-api, reth-network-p2p, reth-network-peers, reth-network-types, reth-nippy-jar, reth-node-api, reth-node-builder, reth-node-core, reth-node-ethereum, reth-node-ethstats, reth-node-events, reth-node-metrics, reth-node-types, reth-optimism-primitives, reth-payload-builder, reth-payload-builder-primitives, reth-payload-primitives, reth-payload-validator, reth-primitives, reth-primitives-traits, reth-provider, reth-prune, reth-prune-types, reth-ress-protocol, reth-ress-provider, reth-revm, reth-rpc, reth-rpc-api, reth-rpc-builder, reth-rpc-convert, reth-rpc-engine-api, reth-rpc-eth-api, reth-rpc-eth-types, reth-rpc-layer, reth-rpc-server-types, reth-stages, reth-stages-api, reth-stages-types, reth-static-file, reth-static-file-types, reth-storage-api, reth-storage-errors, reth-tasks, reth-tokio-util, reth-tracing, …
name = "reth-transaction-pool" name = "reth-transaction-pool"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-consensus", "alloy-consensus",
"alloy-eips", "alloy-eips",
@ -9198,7 +9198,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie" name = "reth-trie"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-consensus", "alloy-consensus",
"alloy-eips", "alloy-eips",
@ -9223,7 +9223,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie-common" name = "reth-trie-common"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-consensus", "alloy-consensus",
"alloy-primitives", "alloy-primitives",
@ -9249,7 +9249,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie-db" name = "reth-trie-db"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-primitives", "alloy-primitives",
"reth-db-api", "reth-db-api",
@ -9262,7 +9262,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie-parallel" name = "reth-trie-parallel"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-primitives", "alloy-primitives",
"alloy-rlp", "alloy-rlp",
@ -9287,7 +9287,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie-sparse" name = "reth-trie-sparse"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-primitives", "alloy-primitives",
"alloy-rlp", "alloy-rlp",
@ -9306,7 +9306,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-trie-sparse-parallel" name = "reth-trie-sparse-parallel"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"alloy-primitives", "alloy-primitives",
"alloy-rlp", "alloy-rlp",
@ -9324,7 +9324,7 @@ dependencies = [
[[package]] [[package]]
name = "reth-zstd-compressors" name = "reth-zstd-compressors"
version = "1.8.2" version = "1.8.2"
source = "git+https://github.com/hl-archive-node/reth?rev=83baf84bcb6d88081fc1b39f97733b8ec345cb88#83baf84bcb6d88081fc1b39f97733b8ec345cb88" source = "git+https://github.com/hl-archive-node/reth?rev=416c2e26756f1c8ee86e6b8e4081f434952b3a1a#416c2e26756f1c8ee86e6b8e4081f434952b3a1a"
dependencies = [ dependencies = [
"zstd", "zstd",
] ]


@ -26,49 +26,49 @@ lto = "fat"
codegen-units = 1 codegen-units = 1
[dependencies] [dependencies]
reth = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-cli = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-cli = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-cli-commands = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-cli-commands = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-basic-payload-builder = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-basic-payload-builder = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-db = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-db = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-db-api = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-db-api = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-chainspec = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-chainspec = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-cli-util = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-cli-util = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-discv4 = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-discv4 = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-engine-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-engine-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-ethereum-forks = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-ethereum-forks = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-ethereum-payload-builder = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-ethereum-payload-builder = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-ethereum-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-ethereum-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-eth-wire = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-eth-wire = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-eth-wire-types = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-eth-wire-types = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-evm = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-evm = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-evm-ethereum = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-evm-ethereum = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-node-core = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-node-core = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-revm = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-revm = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-network = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-network = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-network-p2p = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-network-p2p = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-network-api = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-network-api = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-node-ethereum = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-node-ethereum = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-network-peers = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-network-peers = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-payload-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-payload-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-primitives = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-primitives-traits = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-primitives-traits = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-provider = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88", features = ["test-utils"] } reth-provider = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a", features = ["test-utils"] }
reth-rpc = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-rpc-eth-api = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc-eth-api = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-rpc-engine-api = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc-engine-api = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-tracing = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-tracing = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-trie-common = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-trie-common = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-trie-db = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-trie-db = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-codecs = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-codecs = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-transaction-pool = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-transaction-pool = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-stages-types = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-stages-types = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-storage-api = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-storage-api = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-errors = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-errors = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-rpc-convert = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc-convert = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-rpc-eth-types = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc-eth-types = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-rpc-server-types = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-rpc-server-types = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
reth-metrics = { git = "https://github.com/hl-archive-node/reth", rev = "83baf84bcb6d88081fc1b39f97733b8ec345cb88" } reth-metrics = { git = "https://github.com/hl-archive-node/reth", rev = "416c2e26756f1c8ee86e6b8e4081f434952b3a1a" }
revm = { version = "29.0.1", default-features = false } revm = { version = "29.0.1", default-features = false }
# alloy dependencies # alloy dependencies


@ -19,61 +19,23 @@ use alloy_rpc_types::{
TransactionInfo, TransactionInfo,
pubsub::{Params, SubscriptionKind}, pubsub::{Params, SubscriptionKind},
}; };
use jsonrpsee::{PendingSubscriptionSink, SubscriptionMessage, SubscriptionSink, proc_macros::rpc}; use jsonrpsee::{PendingSubscriptionSink, proc_macros::rpc};
use jsonrpsee_core::{RpcResult, async_trait}; use jsonrpsee_core::{RpcResult, async_trait};
use jsonrpsee_types::{ErrorObject, error::INTERNAL_ERROR_CODE}; use jsonrpsee_types::{ErrorObject, error::INTERNAL_ERROR_CODE};
use reth::{api::FullNodeComponents, builder::rpc::RpcContext, tasks::TaskSpawner}; use reth::{api::FullNodeComponents, builder::rpc::RpcContext, tasks::TaskSpawner};
use reth_primitives_traits::{BlockBody as _, SignedTransaction}; use reth_primitives_traits::SignedTransaction;
use reth_provider::{BlockIdReader, BlockReader, BlockReaderIdExt, ReceiptProvider}; use reth_provider::{BlockIdReader, BlockReader, BlockReaderIdExt, ReceiptProvider};
use reth_rpc::{EthFilter, EthPubSub, RpcTypes, eth::pubsub::SubscriptionSerializeError}; use reth_rpc::{EthFilter, EthPubSub};
use reth_rpc_eth_api::{ use reth_rpc_eth_api::{
EthApiServer, EthApiTypes, EthFilterApiServer, EthPubSubApiServer, FullEthApiTypes, RpcBlock, EthApiTypes, EthFilterApiServer, EthPubSubApiServer, RpcBlock, RpcConvert, RpcReceipt,
RpcConvert, RpcHeader, RpcNodeCoreExt, RpcReceipt, RpcTransaction, RpcTxReq, RpcTransaction, helpers::EthBlocks, transaction::ConvertReceiptInput,
helpers::{EthBlocks, EthTransactions, LoadReceipt},
transaction::ConvertReceiptInput,
}; };
use serde::Serialize; use reth_rpc_eth_types::EthApiError;
use std::{marker::PhantomData, sync::Arc}; use std::{marker::PhantomData, sync::Arc};
use tokio_stream::{Stream, StreamExt}; use tokio_stream::StreamExt;
use tracing::{Instrument, trace}; use tracing::{Instrument, trace};
use crate::{HlBlock, node::primitives::HlPrimitives}; use crate::addons::utils::{EthWrapper, new_headers_stream, pipe_from_stream};
pub trait EthWrapper:
EthApiServer<
RpcTxReq<Self::NetworkTypes>,
RpcTransaction<Self::NetworkTypes>,
RpcBlock<Self::NetworkTypes>,
RpcReceipt<Self::NetworkTypes>,
RpcHeader<Self::NetworkTypes>,
> + FullEthApiTypes<
Primitives = HlPrimitives,
NetworkTypes: RpcTypes<TransactionResponse = alloy_rpc_types_eth::Transaction>,
> + RpcNodeCoreExt<Provider: BlockReader<Block = HlBlock>>
+ EthBlocks
+ EthTransactions
+ LoadReceipt
+ 'static
{
}
impl<T> EthWrapper for T where
T: EthApiServer<
RpcTxReq<Self::NetworkTypes>,
RpcTransaction<Self::NetworkTypes>,
RpcBlock<Self::NetworkTypes>,
RpcReceipt<Self::NetworkTypes>,
RpcHeader<Self::NetworkTypes>,
> + FullEthApiTypes<
Primitives = HlPrimitives,
NetworkTypes: RpcTypes<TransactionResponse = alloy_rpc_types_eth::Transaction>,
> + RpcNodeCoreExt<Provider: BlockReader<Block = HlBlock>>
+ EthBlocks
+ EthTransactions
+ LoadReceipt
+ 'static
{
}
#[rpc(server, namespace = "eth")] #[rpc(server, namespace = "eth")]
#[async_trait] #[async_trait]
@ -385,6 +347,8 @@ where
pubsub.log_stream(filter).filter_map(|log| adjust_log::<Eth>(log, &provider)), pubsub.log_stream(filter).filter_map(|log| adjust_log::<Eth>(log, &provider)),
) )
.await; .await;
} else if kind == SubscriptionKind::NewHeads {
let _ = pipe_from_stream(sink, new_headers_stream::<Eth>(&provider)).await;
} else { } else {
let _ = pubsub.handle_accepted(sink, kind, params).await; let _ = pubsub.handle_accepted(sink, kind, params).await;
} }
@ -411,23 +375,6 @@ fn adjust_log<Eth: EthWrapper>(mut log: Log, provider: &Eth::Provider) -> Option
Some(log) Some(log)
} }
async fn pipe_from_stream<T: Serialize, St: Stream<Item = T> + Unpin>(
sink: SubscriptionSink,
mut stream: St,
) -> Result<(), ErrorObject<'static>> {
loop {
tokio::select! {
_ = sink.closed() => break Ok(()),
maybe_item = stream.next() => {
let Some(item) = maybe_item else { break Ok(()) };
let msg = SubscriptionMessage::new(sink.method_name(), sink.subscription_id(), &item)
.map_err(SubscriptionSerializeError::from)?;
if sink.send(msg).await.is_err() { break Ok(()); }
}
}
}
}
pub struct HlNodeBlockFilterHttp<Eth: EthWrapper> { pub struct HlNodeBlockFilterHttp<Eth: EthWrapper> {
eth_api: Arc<Eth>, eth_api: Arc<Eth>,
_marker: PhantomData<Eth>, _marker: PhantomData<Eth>,
@ -578,9 +525,9 @@ async fn adjust_transaction_receipt<Eth: EthWrapper>(
// This function assumes that `block_id` is already validated by the caller. // This function assumes that `block_id` is already validated by the caller.
fn system_tx_count_for_block<Eth: EthWrapper>(eth_api: &Eth, block_id: BlockId) -> usize { fn system_tx_count_for_block<Eth: EthWrapper>(eth_api: &Eth, block_id: BlockId) -> usize {
let provider = eth_api.provider(); let provider = eth_api.provider();
let block = provider.block_by_id(block_id).unwrap().unwrap(); let header = provider.header_by_id(block_id).unwrap().unwrap();
block.body.transactions().iter().filter(|tx| tx.is_system_transaction()).count() header.extras.system_tx_count.try_into().unwrap()
} }
#[async_trait] #[async_trait]
@ -654,6 +601,9 @@ where
block_id: BlockId, block_id: BlockId,
) -> RpcResult<Option<Vec<RpcReceipt<Eth::NetworkTypes>>>> { ) -> RpcResult<Option<Vec<RpcReceipt<Eth::NetworkTypes>>>> {
trace!(target: "rpc::eth", ?block_id, "Serving eth_getBlockReceipts"); trace!(target: "rpc::eth", ?block_id, "Serving eth_getBlockReceipts");
if self.eth_api.provider().block_by_id(block_id).map_err(EthApiError::from)?.is_none() {
return Ok(None);
}
let result = let result =
adjust_block_receipts(block_id, &*self.eth_api).instrument(engine_span!()).await?; adjust_block_receipts(block_id, &*self.eth_api).instrument(engine_span!()).await?;
Ok(result.map(|(_, receipts)| receipts)) Ok(result.map(|(_, receipts)| receipts))


@ -1,3 +1,5 @@
pub mod call_forwarder; pub mod call_forwarder;
pub mod hl_node_compliance; pub mod hl_node_compliance;
pub mod subscribe_fixup;
pub mod tx_forwarder; pub mod tx_forwarder;
mod utils;


@ -0,0 +1,54 @@
use crate::addons::utils::{EthWrapper, new_headers_stream, pipe_from_stream};
use alloy_rpc_types::pubsub::{Params, SubscriptionKind};
use async_trait::async_trait;
use jsonrpsee::PendingSubscriptionSink;
use jsonrpsee_types::ErrorObject;
use reth::tasks::TaskSpawner;
use reth_rpc::EthPubSub;
use reth_rpc_convert::RpcTransaction;
use reth_rpc_eth_api::{EthApiTypes, EthPubSubApiServer};
use std::sync::Arc;
pub struct SubscribeFixup<Eth: EthWrapper> {
pubsub: Arc<EthPubSub<Eth>>,
provider: Arc<Eth::Provider>,
subscription_task_spawner: Box<dyn TaskSpawner + 'static>,
}
#[async_trait]
impl<Eth: EthWrapper> EthPubSubApiServer<RpcTransaction<Eth::NetworkTypes>> for SubscribeFixup<Eth>
where
ErrorObject<'static>: From<<Eth as EthApiTypes>::Error>,
{
async fn subscribe(
&self,
pending: PendingSubscriptionSink,
kind: SubscriptionKind,
params: Option<Params>,
) -> jsonrpsee::core::SubscriptionResult {
let sink = pending.accept().await?;
let (pubsub, provider) = (self.pubsub.clone(), self.provider.clone());
self.subscription_task_spawner.spawn(Box::pin(async move {
if kind == SubscriptionKind::NewHeads {
let _ = pipe_from_stream(sink, new_headers_stream::<Eth>(&provider)).await;
} else {
let _ = pubsub.handle_accepted(sink, kind, params).await;
}
}));
Ok(())
}
}
impl<Eth: EthWrapper> SubscribeFixup<Eth> {
pub fn new(
pubsub: Arc<EthPubSub<Eth>>,
provider: Arc<Eth::Provider>,
subscription_task_spawner: Box<dyn TaskSpawner + 'static>,
) -> Self
where
Eth: EthWrapper,
ErrorObject<'static>: From<Eth::Error>,
{
Self { pubsub, provider, subscription_task_spawner }
}
}

src/addons/utils.rs (new file, 90 lines)

@ -0,0 +1,90 @@
use std::sync::Arc;
use crate::{HlBlock, HlPrimitives};
use alloy_primitives::U256;
use alloy_rpc_types::Header;
use futures::StreamExt;
use jsonrpsee::{SubscriptionMessage, SubscriptionSink};
use jsonrpsee_types::ErrorObject;
use reth_primitives::SealedHeader;
use reth_provider::{BlockReader, CanonStateSubscriptions};
use reth_rpc::{RpcTypes, eth::pubsub::SubscriptionSerializeError};
use reth_rpc_convert::{RpcBlock, RpcHeader, RpcReceipt, RpcTransaction, RpcTxReq};
use reth_rpc_eth_api::{
EthApiServer, FullEthApiTypes, RpcNodeCoreExt,
helpers::{EthBlocks, EthTransactions, LoadReceipt},
};
use serde::Serialize;
use tokio_stream::Stream;
pub trait EthWrapper:
EthApiServer<
RpcTxReq<Self::NetworkTypes>,
RpcTransaction<Self::NetworkTypes>,
RpcBlock<Self::NetworkTypes>,
RpcReceipt<Self::NetworkTypes>,
RpcHeader<Self::NetworkTypes>,
> + FullEthApiTypes<
Primitives = HlPrimitives,
NetworkTypes: RpcTypes<TransactionResponse = alloy_rpc_types_eth::Transaction>,
> + RpcNodeCoreExt<Provider: BlockReader<Block = HlBlock>>
+ EthBlocks
+ EthTransactions
+ LoadReceipt
+ 'static
{
}
impl<T> EthWrapper for T where
T: EthApiServer<
RpcTxReq<Self::NetworkTypes>,
RpcTransaction<Self::NetworkTypes>,
RpcBlock<Self::NetworkTypes>,
RpcReceipt<Self::NetworkTypes>,
RpcHeader<Self::NetworkTypes>,
> + FullEthApiTypes<
Primitives = HlPrimitives,
NetworkTypes: RpcTypes<TransactionResponse = alloy_rpc_types_eth::Transaction>,
> + RpcNodeCoreExt<Provider: BlockReader<Block = HlBlock>>
+ EthBlocks
+ EthTransactions
+ LoadReceipt
+ 'static
{
}
pub(super) async fn pipe_from_stream<T: Serialize, St: Stream<Item = T> + Unpin>(
sink: SubscriptionSink,
mut stream: St,
) -> Result<(), ErrorObject<'static>> {
loop {
tokio::select! {
_ = sink.closed() => break Ok(()),
maybe_item = stream.next() => {
let Some(item) = maybe_item else { break Ok(()) };
let msg = SubscriptionMessage::new(sink.method_name(), sink.subscription_id(), &item)
.map_err(SubscriptionSerializeError::from)?;
if sink.send(msg).await.is_err() { break Ok(()); }
}
}
}
}
pub(super) fn new_headers_stream<Eth: EthWrapper>(
provider: &Arc<Eth::Provider>,
) -> impl Stream<Item = Header<alloy_consensus::Header>> {
provider.canonical_state_stream().flat_map(|new_chain| {
let headers = new_chain
.committed()
.blocks_iter()
.map(|block| {
Header::from_consensus(
SealedHeader::new(block.header().inner.clone(), block.hash()).into(),
None,
Some(U256::from(block.rlp_length())),
)
})
.collect::<Vec<_>>();
futures::stream::iter(headers)
})
}
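As a side note, a minimal self-contained sketch of the forwarding pattern pipe_from_stream uses above: drain a stream into a consumer until either side goes away. The jsonrpsee sink and serialization details are elided; a tokio mpsc sender stands in for the subscription sink purely for illustration.

use tokio_stream::{Stream, StreamExt};

async fn forward<T, St>(mut stream: St, tx: tokio::sync::mpsc::Sender<T>)
where
    St: Stream<Item = T> + Unpin,
{
    loop {
        tokio::select! {
            // Stop once the receiver is gone (analogous to `sink.closed()`).
            _ = tx.closed() => break,
            maybe_item = stream.next() => {
                let Some(item) = maybe_item else { break };
                // Analogous to `sink.send(msg).await`; stop if the consumer dropped.
                if tx.send(item).await.is_err() { break; }
            }
        }
    }
}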


@ -1,8 +1,10 @@
pub mod hl; pub mod hl;
pub mod parser; pub mod parser;
use crate::hardforks::HlHardforks; use crate::{
use alloy_consensus::Header; hardforks::HlHardforks,
node::primitives::{HlHeader, header::HlHeaderExtras},
};
use alloy_eips::eip7840::BlobParams; use alloy_eips::eip7840::BlobParams;
use alloy_genesis::Genesis; use alloy_genesis::Genesis;
use alloy_primitives::{Address, B256, U256}; use alloy_primitives::{Address, B256, U256};
@ -20,10 +22,11 @@ pub const TESTNET_CHAIN_ID: u64 = 998;
#[derive(Debug, Default, Clone, PartialEq, Eq)] #[derive(Debug, Default, Clone, PartialEq, Eq)]
pub struct HlChainSpec { pub struct HlChainSpec {
pub inner: ChainSpec, pub inner: ChainSpec,
pub genesis_header: HlHeader,
} }
impl EthChainSpec for HlChainSpec { impl EthChainSpec for HlChainSpec {
type Header = Header; type Header = HlHeader;
fn blob_params_at_timestamp(&self, timestamp: u64) -> Option<BlobParams> { fn blob_params_at_timestamp(&self, timestamp: u64) -> Option<BlobParams> {
self.inner.blob_params_at_timestamp(timestamp) self.inner.blob_params_at_timestamp(timestamp)
@ -57,8 +60,8 @@ impl EthChainSpec for HlChainSpec {
Box::new(self.inner.display_hardforks()) Box::new(self.inner.display_hardforks())
} }
fn genesis_header(&self) -> &Header { fn genesis_header(&self) -> &HlHeader {
self.inner.genesis_header() &self.genesis_header
} }
fn genesis(&self) -> &Genesis { fn genesis(&self) -> &Genesis {
@ -127,4 +130,10 @@ impl HlChainSpec {
_ => unreachable!("Unreachable since ChainSpecParser won't return other chains"), _ => unreachable!("Unreachable since ChainSpecParser won't return other chains"),
} }
} }
fn new(inner: ChainSpec) -> Self {
let genesis_header =
HlHeader { inner: inner.genesis_header().clone(), extras: HlHeaderExtras::default() };
Self { inner, genesis_header }
}
} }


@ -26,8 +26,8 @@ impl ChainSpecParser for HlChainSpecParser {
/// Currently mainnet and testnet are supported. /// Currently mainnet and testnet are supported.
pub fn chain_value_parser(s: &str) -> eyre::Result<Arc<HlChainSpec>> { pub fn chain_value_parser(s: &str) -> eyre::Result<Arc<HlChainSpec>> {
match s { match s {
"mainnet" => Ok(Arc::new(HlChainSpec { inner: hl_mainnet() })), "mainnet" => Ok(Arc::new(HlChainSpec::new(hl_mainnet()))),
"testnet" => Ok(Arc::new(HlChainSpec { inner: hl_testnet() })), "testnet" => Ok(Arc::new(HlChainSpec::new(hl_testnet()))),
_ => Err(eyre::eyre!("Unsupported chain: {}", s)), _ => Err(eyre::eyre!("Unsupported chain: {}", s)),
} }
} }
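A small usage sketch (assumed, not part of the diff) combining the parser entry point above with the new HlChainSpec::new constructor; the eyre-based error handling is illustrative and EthChainSpec is assumed to be in scope for genesis_header().

fn load_spec() -> eyre::Result<()> {
    // "mainnet" and "testnet" are the only accepted values; anything else errors out.
    let spec = chain_value_parser("mainnet")?;
    // The genesis header now comes from the HlHeader stored on HlChainSpec.
    let _genesis = spec.genesis_header();
    Ok(())
}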


@ -7,36 +7,12 @@ use alloy_primitives::keccak256;
use revm::{ use revm::{
context::Host, context::Host,
interpreter::{ interpreter::{
InstructionContext, InterpreterTypes, as_u64_saturated, interpreter_types::StackTr, _count, InstructionContext, InterpreterTypes, as_u64_saturated, interpreter_types::StackTr,
popn_top, popn_top,
}, },
primitives::{BLOCK_HASH_HISTORY, U256}, primitives::{BLOCK_HASH_HISTORY, U256},
}; };
#[doc(hidden)]
#[macro_export]
#[collapse_debuginfo(yes)]
macro_rules! _count {
(@count) => { 0 };
(@count $head:tt $($tail:tt)*) => { 1 + _count!(@count $($tail)*) };
($($arg:tt)*) => { _count!(@count $($arg)*) };
}
/// Pops n values from the stack and returns the top value. Fails the instruction if n values can't
/// be popped.
#[macro_export]
#[collapse_debuginfo(yes)]
macro_rules! popn_top {
([ $($x:ident),* ], $top:ident, $interpreter:expr $(,$ret:expr)? ) => {
// Workaround for https://github.com/rust-lang/rust/issues/144329.
if $interpreter.stack.len() < (1 + $crate::_count!($($x)*)) {
$interpreter.halt_underflow();
return $($ret)?;
}
let ([$( $x ),*], $top) = unsafe { $interpreter.stack.popn_top().unwrap_unchecked() };
};
}
/// Implements the BLOCKHASH instruction. /// Implements the BLOCKHASH instruction.
/// ///
/// Gets the hash of one of the 256 most recent complete blocks. /// Gets the hash of one of the 256 most recent complete blocks.
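A minimal sketch (not taken from the diff) of the window check the doc comment above describes: BLOCKHASH only serves the 256 most recent complete blocks, and anything outside that window yields zero. The literal 256 stands in for the revm BLOCK_HASH_HISTORY constant imported above.

fn in_blockhash_window(current_block: u64, requested: u64) -> bool {
    // Valid targets are [current_block - 256, current_block - 1].
    requested < current_block && current_block - requested <= 256
}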


@ -7,4 +7,4 @@ pub mod node;
pub mod pseudo_peer; pub mod pseudo_peer;
pub mod version; pub mod version;
pub use node::primitives::{HlBlock, HlBlockBody, HlPrimitives}; pub use node::primitives::{HlBlock, HlBlockBody, HlHeader, HlPrimitives};


@ -1,12 +1,16 @@
use std::sync::Arc; use std::sync::Arc;
use clap::Parser; use clap::Parser;
use reth::builder::{NodeBuilder, NodeHandle, WithLaunchContext}; use reth::{
builder::{NodeBuilder, NodeHandle, WithLaunchContext},
rpc::{api::EthPubSubApiServer, eth::RpcNodeCore},
};
use reth_db::DatabaseEnv; use reth_db::DatabaseEnv;
use reth_hl::{ use reth_hl::{
addons::{ addons::{
call_forwarder::{self, CallForwarderApiServer}, call_forwarder::{self, CallForwarderApiServer},
hl_node_compliance::install_hl_node_compliance, hl_node_compliance::install_hl_node_compliance,
subscribe_fixup::SubscribeFixup,
tx_forwarder::{self, EthForwarderApiServer}, tx_forwarder::{self, EthForwarderApiServer},
}, },
chainspec::{HlChainSpec, parser::HlChainSpecParser}, chainspec::{HlChainSpec, parser::HlChainSpecParser},
@ -14,7 +18,9 @@ use reth_hl::{
HlNode, HlNode,
cli::{Cli, HlNodeArgs}, cli::{Cli, HlNodeArgs},
rpc::precompile::{HlBlockPrecompileApiServer, HlBlockPrecompileExt}, rpc::precompile::{HlBlockPrecompileApiServer, HlBlockPrecompileExt},
spot_meta::init as spot_meta_init,
storage::tables::Tables, storage::tables::Tables,
types::set_spot_metadata_db,
}, },
}; };
use tracing::info; use tracing::info;
@ -35,8 +41,11 @@ fn main() -> eyre::Result<()> {
ext: HlNodeArgs| async move { ext: HlNodeArgs| async move {
let default_upstream_rpc_url = builder.config().chain.official_rpc_url(); let default_upstream_rpc_url = builder.config().chain.official_rpc_url();
let (node, engine_handle_tx) = let (node, engine_handle_tx) = HlNode::new(
HlNode::new(ext.block_source_args.parse().await?, ext.debug_cutoff_height); ext.block_source_args.parse().await?,
ext.debug_cutoff_height,
ext.allow_network_overrides,
);
let NodeHandle { node, node_exit_future: exit_future } = builder let NodeHandle { node, node_exit_future: exit_future } = builder
.node(node) .node(node)
.extend_rpc_modules(move |mut ctx| { .extend_rpc_modules(move |mut ctx| {
@ -59,6 +68,17 @@ fn main() -> eyre::Result<()> {
info!("Call/gas estimation will be forwarded to {}", upstream_rpc_url); info!("Call/gas estimation will be forwarded to {}", upstream_rpc_url);
} }
// This is a temporary workaround for the issue with custom headers that
// affects `eth_subscribe[type=newHeads]`
ctx.modules.replace_configured(
SubscribeFixup::new(
Arc::new(ctx.registry.eth_handlers().pubsub.clone()),
Arc::new(ctx.registry.eth_api().provider().clone()),
Box::new(ctx.node().task_executor.clone()),
)
.into_rpc(),
)?;
if ext.hl_node_compliant { if ext.hl_node_compliant {
install_hl_node_compliance(&mut ctx)?; install_hl_node_compliance(&mut ctx)?;
info!("hl-node compliant mode enabled"); info!("hl-node compliant mode enabled");
@ -77,6 +97,16 @@ fn main() -> eyre::Result<()> {
}) })
.apply(|mut builder| { .apply(|mut builder| {
builder.db_mut().create_tables_for::<Tables>().expect("create tables"); builder.db_mut().create_tables_for::<Tables>().expect("create tables");
let chain_id = builder.config().chain.inner.chain().id();
let db = builder.db_mut().clone();
// Set database handle for on-demand persistence
set_spot_metadata_db(db.clone());
// Load spot metadata from database and initialize cache
spot_meta_init::load_spot_metadata_cache(&db, chain_id);
builder builder
}) })
.launch() .launch()


@ -1,12 +1,15 @@
use crate::{ use crate::{
chainspec::{HlChainSpec, parser::HlChainSpecParser}, chainspec::{HlChainSpec, parser::HlChainSpecParser},
node::{HlNode, consensus::HlConsensus, evm::config::HlEvmConfig, storage::tables::Tables}, node::{
HlNode, consensus::HlConsensus, evm::config::HlEvmConfig, migrate::Migrator,
spot_meta::init as spot_meta_init, storage::tables::Tables,
},
pseudo_peer::BlockSourceArgs, pseudo_peer::BlockSourceArgs,
}; };
use clap::{Args, Parser}; use clap::{Args, Parser};
use reth::{ use reth::{
CliRunner, CliRunner,
args::LogArgs, args::{DatabaseArgs, DatadirArgs, LogArgs},
builder::{NodeBuilder, WithLaunchContext}, builder::{NodeBuilder, WithLaunchContext},
cli::Commands, cli::Commands,
prometheus_exporter::install_prometheus_recorder, prometheus_exporter::install_prometheus_recorder,
@ -79,6 +82,13 @@ pub struct HlNodeArgs {
/// * Refers to the Merkle trie used for eth_getProof and state root, not actual state values. /// * Refers to the Merkle trie used for eth_getProof and state root, not actual state values.
#[arg(long, env = "EXPERIMENTAL_ETH_GET_PROOF")] #[arg(long, env = "EXPERIMENTAL_ETH_GET_PROOF")]
pub experimental_eth_get_proof: bool, pub experimental_eth_get_proof: bool,
/// Allow network configuration overrides from CLI.
///
/// When enabled, network settings (discovery_addr, listener_addr, dns_discovery, nat)
/// will be taken from CLI arguments instead of being hardcoded to localhost-only defaults.
#[arg(long, env = "ALLOW_NETWORK_OVERRIDES")]
pub allow_network_overrides: bool,
} }
/// The main reth_hl cli interface. /// The main reth_hl cli interface.
@ -142,6 +152,12 @@ where
match self.command { match self.command {
Commands::Node(command) => runner.run_command_until_exit(|ctx| { Commands::Node(command) => runner.run_command_until_exit(|ctx| {
// NOTE: This was for a one-time migration around the Oct 10 upgrade.
// It is no longer necessary, so it is now gated behind an environment variable.
if std::env::var("CHECK_DB_MIGRATION").is_ok() {
Self::migrate_db(&command.chain, &command.datadir, &command.db)
.expect("Failed to migrate database");
}
command.execute(ctx, FnLauncher::new::<C, Ext>(launcher)) command.execute(ctx, FnLauncher::new::<C, Ext>(launcher))
}), }),
Commands::Init(command) => { Commands::Init(command) => {
@ -185,7 +201,21 @@ where
let data_dir = env.datadir.clone().resolve_datadir(env.chain.chain()); let data_dir = env.datadir.clone().resolve_datadir(env.chain.chain());
let db_path = data_dir.db(); let db_path = data_dir.db();
init_db(db_path.clone(), env.db.database_args())?; init_db(db_path.clone(), env.db.database_args())?;
init_db_for::<_, Tables>(db_path, env.db.database_args())?; init_db_for::<_, Tables>(db_path.clone(), env.db.database_args())?;
// Initialize spot metadata in database
let chain_id = env.chain.chain().id();
spot_meta_init::init_spot_metadata(db_path, env.db.database_args(), chain_id)?;
Ok(())
}
fn migrate_db(
chain: &HlChainSpec,
datadir: &DatadirArgs,
db: &DatabaseArgs,
) -> eyre::Result<()> {
Migrator::<HlNode>::new(chain.clone(), datadir.clone(), *db)?.migrate_db()?;
Ok(()) Ok(())
} }
} }


@ -1,5 +1,8 @@
use crate::{HlBlock, HlBlockBody, HlPrimitives, hardforks::HlHardforks, node::HlNode}; use crate::{
use alloy_consensus::Header; HlBlock, HlBlockBody, HlPrimitives,
hardforks::HlHardforks,
node::{HlNode, primitives::HlHeader},
};
use reth::{ use reth::{
api::{FullNodeTypes, NodeTypes}, api::{FullNodeTypes, NodeTypes},
beacon_consensus::EthBeaconConsensus, beacon_consensus::EthBeaconConsensus,
@ -101,14 +104,14 @@ where
impl<ChainSpec> Consensus<HlBlock> for HlConsensus<ChainSpec> impl<ChainSpec> Consensus<HlBlock> for HlConsensus<ChainSpec>
where where
ChainSpec: EthChainSpec<Header = Header> + HlHardforks, ChainSpec: EthChainSpec<Header = HlHeader> + HlHardforks,
{ {
type Error = ConsensusError; type Error = ConsensusError;
fn validate_body_against_header( fn validate_body_against_header(
&self, &self,
body: &HlBlockBody, body: &HlBlockBody,
header: &SealedHeader, header: &SealedHeader<HlHeader>,
) -> Result<(), ConsensusError> { ) -> Result<(), ConsensusError> {
Consensus::<HlBlock>::validate_body_against_header(&self.inner, body, header) Consensus::<HlBlock>::validate_body_against_header(&self.inner, body, header)
} }
@ -148,7 +151,7 @@ mod reth_copy;
impl<ChainSpec> FullConsensus<HlPrimitives> for HlConsensus<ChainSpec> impl<ChainSpec> FullConsensus<HlPrimitives> for HlConsensus<ChainSpec>
where where
ChainSpec: EthChainSpec<Header = Header> + HlHardforks, ChainSpec: EthChainSpec<Header = HlHeader> + HlHardforks,
{ {
fn validate_block_post_execution( fn validate_block_post_execution(
&self, &self,


@ -1,21 +1,21 @@
//! Copy of reth codebase. //! Copy of reth codebase.
use crate::HlBlock;
use alloy_consensus::{BlockHeader, TxReceipt, proofs::calculate_receipt_root}; use alloy_consensus::{BlockHeader, TxReceipt, proofs::calculate_receipt_root};
use alloy_eips::eip7685::Requests; use alloy_eips::eip7685::Requests;
use alloy_primitives::{B256, Bloom}; use alloy_primitives::{B256, Bloom};
use reth::consensus::ConsensusError; use reth::consensus::ConsensusError;
use reth_chainspec::EthereumHardforks; use reth_chainspec::EthereumHardforks;
use reth_primitives::{GotExpected, RecoveredBlock, gas_spent_by_transactions}; use reth_primitives::{GotExpected, RecoveredBlock, gas_spent_by_transactions};
use reth_primitives_traits::{Block, Receipt as ReceiptTrait}; use reth_primitives_traits::Receipt as ReceiptTrait;
pub fn validate_block_post_execution<B, R, ChainSpec>( pub fn validate_block_post_execution<R, ChainSpec>(
block: &RecoveredBlock<B>, block: &RecoveredBlock<HlBlock>,
chain_spec: &ChainSpec, chain_spec: &ChainSpec,
receipts: &[R], receipts: &[R],
requests: &Requests, requests: &Requests,
) -> Result<(), ConsensusError> ) -> Result<(), ConsensusError>
where where
B: Block,
R: ReceiptTrait, R: ReceiptTrait,
ChainSpec: EthereumHardforks, ChainSpec: EthereumHardforks,
{ {
@ -42,7 +42,7 @@ where
receipts.iter().filter(|&r| r.cumulative_gas_used() != 0).cloned().collect::<Vec<_>>(); receipts.iter().filter(|&r| r.cumulative_gas_used() != 0).cloned().collect::<Vec<_>>();
if let Err(error) = verify_receipts( if let Err(error) = verify_receipts(
block.header().receipts_root(), block.header().receipts_root(),
block.header().logs_bloom(), block.header().inner.logs_bloom(),
&receipts_for_root, &receipts_for_root,
) { ) {
tracing::debug!(%error, ?receipts, "receipts verification failed"); tracing::debug!(%error, ?receipts, "receipts verification failed");


@ -1,8 +1,7 @@
use crate::{ use crate::{
HlBlock, HlBlock, HlHeader,
node::evm::config::{HlBlockExecutorFactory, HlEvmConfig}, node::evm::config::{HlBlockExecutorFactory, HlEvmConfig},
}; };
use alloy_consensus::Header;
use reth_evm::{ use reth_evm::{
block::BlockExecutionError, block::BlockExecutionError,
execute::{BlockAssembler, BlockAssemblerInput}, execute::{BlockAssembler, BlockAssemblerInput},
@ -13,7 +12,7 @@ impl BlockAssembler<HlBlockExecutorFactory> for HlEvmConfig {
fn assemble_block( fn assemble_block(
&self, &self,
input: BlockAssemblerInput<'_, '_, HlBlockExecutorFactory, Header>, input: BlockAssemblerInput<'_, '_, HlBlockExecutorFactory, HlHeader>,
) -> Result<Self::Block, BlockExecutionError> { ) -> Result<Self::Block, BlockExecutionError> {
let HlBlock { header, body } = self.block_assembler.assemble_block(input)?; let HlBlock { header, body } = self.block_assembler.assemble_block(input)?;
Ok(HlBlock { header, body }) Ok(HlBlock { header, body })


@ -1,6 +1,6 @@
use super::{executor::HlBlockExecutor, factory::HlEvmFactory}; use super::{executor::HlBlockExecutor, factory::HlEvmFactory};
use crate::{ use crate::{
HlBlock, HlBlockBody, HlPrimitives, HlBlock, HlBlockBody, HlHeader, HlPrimitives,
chainspec::HlChainSpec, chainspec::HlChainSpec,
evm::{spec::HlSpecId, transaction::HlTxEnv}, evm::{spec::HlSpecId, transaction::HlTxEnv},
hardforks::HlHardforks, hardforks::HlHardforks,
@ -54,7 +54,7 @@ where
fn assemble_block( fn assemble_block(
&self, &self,
input: BlockAssemblerInput<'_, '_, F>, input: BlockAssemblerInput<'_, '_, F, HlHeader>,
) -> Result<Self::Block, BlockExecutionError> { ) -> Result<Self::Block, BlockExecutionError> {
// TODO: Copy of EthBlockAssembler::assemble_block // TODO: Copy of EthBlockAssembler::assemble_block
let inner = &self.inner; let inner = &self.inner;
@ -136,6 +136,9 @@ where
excess_blob_gas, excess_blob_gas,
requests_hash, requests_hash,
}; };
let system_tx_count =
transactions.iter().filter(|t| is_system_transaction(t)).count() as u64;
let header = HlHeader::from_ethereum_header(header, receipts, system_tx_count);
Ok(Self::Block { Ok(Self::Block {
header, header,
@ -269,6 +272,8 @@ where
} }
} }
static EMPTY_OMMERS: [Header; 0] = [];
impl ConfigureEvm for HlEvmConfig impl ConfigureEvm for HlEvmConfig
where where
Self: Send + Sync + Unpin + Clone + 'static, Self: Send + Sync + Unpin + Clone + 'static,
@ -287,7 +292,7 @@ where
self self
} }
fn evm_env(&self, header: &Header) -> Result<EvmEnv<HlSpecId>, Self::Error> { fn evm_env(&self, header: &HlHeader) -> Result<EvmEnv<HlSpecId>, Self::Error> {
let blob_params = self.chain_spec().blob_params_at_timestamp(header.timestamp); let blob_params = self.chain_spec().blob_params_at_timestamp(header.timestamp);
let spec = revm_spec_by_timestamp_and_block_number( let spec = revm_spec_by_timestamp_and_block_number(
self.chain_spec().clone(), self.chain_spec().clone(),
@ -332,7 +337,7 @@ where
fn next_evm_env( fn next_evm_env(
&self, &self,
parent: &Header, parent: &HlHeader,
attributes: &Self::NextBlockEnvCtx, attributes: &Self::NextBlockEnvCtx,
) -> Result<EvmEnv<HlSpecId>, Self::Error> { ) -> Result<EvmEnv<HlSpecId>, Self::Error> {
// ensure we're not missing any timestamp based hardforks // ensure we're not missing any timestamp based hardforks
@ -382,7 +387,7 @@ where
ctx: EthBlockExecutionCtx { ctx: EthBlockExecutionCtx {
parent_hash: block.header().parent_hash, parent_hash: block.header().parent_hash,
parent_beacon_block_root: block.header().parent_beacon_block_root, parent_beacon_block_root: block.header().parent_beacon_block_root,
ommers: &block.body().ommers, ommers: &EMPTY_OMMERS,
withdrawals: block.body().withdrawals.as_ref().map(Cow::Borrowed), withdrawals: block.body().withdrawals.as_ref().map(Cow::Borrowed),
}, },
extras: HlExtras { extras: HlExtras {
@ -420,7 +425,7 @@ impl ConfigureEngineEvm<HlExecutionData> for HlEvmConfig {
ctx: EthBlockExecutionCtx { ctx: EthBlockExecutionCtx {
parent_hash: block.header.parent_hash, parent_hash: block.header.parent_hash,
parent_beacon_block_root: block.header.parent_beacon_block_root, parent_beacon_block_root: block.header.parent_beacon_block_root,
ommers: &block.body.ommers, ommers: &EMPTY_OMMERS,
withdrawals: block.body.withdrawals.as_ref().map(Cow::Borrowed), withdrawals: block.body.withdrawals.as_ref().map(Cow::Borrowed),
}, },
extras: HlExtras { extras: HlExtras {

src/node/migrate.rs (new file, 429 lines)

@ -0,0 +1,429 @@
use alloy_consensus::Header;
use alloy_primitives::{B256, BlockHash, Bytes, U256, b256, hex::ToHexExt};
use reth::{
api::NodeTypesWithDBAdapter,
args::{DatabaseArgs, DatadirArgs},
dirs::{ChainPath, DataDirPath},
};
use reth_chainspec::EthChainSpec;
use reth_db::{
DatabaseEnv,
mdbx::{RO, tx::Tx},
models::CompactU256,
static_file::iter_static_files,
table::Decompress,
tables,
};
use reth_db_api::{
cursor::{DbCursorRO, DbCursorRW},
transaction::{DbTx, DbTxMut},
};
use reth_errors::ProviderResult;
use reth_ethereum_primitives::EthereumReceipt;
use reth_provider::{
DatabaseProvider, ProviderFactory, ReceiptProvider, StaticFileProviderFactory,
StaticFileSegment, StaticFileWriter,
providers::{NodeTypesForProvider, StaticFileProvider},
static_file::SegmentRangeInclusive,
};
use std::{fs::File, io::Write, path::PathBuf, sync::Arc};
use tracing::{info, warn};
use crate::{HlHeader, HlPrimitives, chainspec::HlChainSpec};
pub(crate) trait HlNodeType:
NodeTypesForProvider<ChainSpec = HlChainSpec, Primitives = HlPrimitives>
{
}
impl<N: NodeTypesForProvider<ChainSpec = HlChainSpec, Primitives = HlPrimitives>> HlNodeType for N {}
pub(super) struct Migrator<N: HlNodeType> {
data_dir: ChainPath<DataDirPath>,
provider_factory: ProviderFactory<NodeTypesWithDBAdapter<N, Arc<DatabaseEnv>>>,
}
impl<N: HlNodeType> Migrator<N> {
const MIGRATION_PATH_SUFFIX: &'static str = "migration-tmp";
pub fn new(
chain_spec: HlChainSpec,
datadir: DatadirArgs,
database_args: DatabaseArgs,
) -> eyre::Result<Self> {
let data_dir = datadir.clone().resolve_datadir(chain_spec.chain());
let provider_factory = Self::provider_factory(chain_spec, datadir, database_args)?;
Ok(Self { data_dir, provider_factory })
}
pub fn sf_provider(&self) -> StaticFileProvider<HlPrimitives> {
self.provider_factory.static_file_provider()
}
pub fn migrate_db(&self) -> eyre::Result<()> {
let is_empty = Self::highest_block_number(&self.sf_provider()).is_none();
if is_empty {
return Ok(());
}
self.migrate_db_inner()
}
fn highest_block_number(sf_provider: &StaticFileProvider<HlPrimitives>) -> Option<u64> {
sf_provider.get_highest_static_file_block(StaticFileSegment::Headers)
}
fn migrate_db_inner(&self) -> eyre::Result<()> {
let migrated_mdbx = MigratorMdbx::<N>(self).migrate_mdbx()?;
let migrated_static_files = MigrateStaticFiles::<N>(self).migrate_static_files()?;
if migrated_mdbx || migrated_static_files {
info!("Database migrated successfully");
}
Ok(())
}
fn conversion_tmp_dir(&self) -> PathBuf {
self.data_dir.data_dir().join(Self::MIGRATION_PATH_SUFFIX)
}
fn provider_factory(
chain_spec: HlChainSpec,
datadir: DatadirArgs,
database_args: DatabaseArgs,
) -> eyre::Result<ProviderFactory<NodeTypesWithDBAdapter<N, Arc<DatabaseEnv>>>> {
let data_dir = datadir.clone().resolve_datadir(chain_spec.chain());
let db_env = reth_db::init_db(data_dir.db(), database_args.database_args())?;
let static_file_provider = StaticFileProvider::read_only(data_dir.static_files(), false)?;
let db = Arc::new(db_env);
Ok(ProviderFactory::new(db, Arc::new(chain_spec), static_file_provider))
}
}
struct MigratorMdbx<'a, N: HlNodeType>(&'a Migrator<N>);
impl<'a, N: HlNodeType> MigratorMdbx<'a, N> {
fn migrate_mdbx(&self) -> eyre::Result<bool> {
// If any header is in the old format we need to migrate, so we check the first and last ones
let db_env = self.0.provider_factory.provider()?;
let mut cursor = db_env.tx_ref().cursor_read::<tables::Headers<Bytes>>()?;
let migration_needed = {
let first_is_old = match cursor.first()? {
Some((number, header)) => using_old_header(number, &header),
None => false,
};
let last_is_old = match cursor.last()? {
Some((number, header)) => using_old_header(number, &header),
None => false,
};
first_is_old || last_is_old
};
if !migration_needed {
return Ok(false);
}
check_if_migration_enabled()?;
self.migrate_mdbx_inner()?;
Ok(true)
}
fn migrate_mdbx_inner(&self) -> eyre::Result<()> {
// There shouldn't be many headers in MDBX, but use a temporary file for safety
info!("Old database detected, migrating mdbx...");
let conversion_tmp = self.0.conversion_tmp_dir();
let tmp_path = conversion_tmp.join("headers.rmp");
if conversion_tmp.exists() {
std::fs::remove_dir_all(&conversion_tmp)?;
}
std::fs::create_dir_all(&conversion_tmp)?;
let count = self.export_old_headers(&tmp_path)?;
self.import_new_headers(tmp_path, count)?;
Ok(())
}
fn export_old_headers(&self, tmp_path: &PathBuf) -> Result<i32, eyre::Error> {
let db_env = self.0.provider_factory.provider()?;
let mut cursor_read = db_env.tx_ref().cursor_read::<tables::Headers<Bytes>>()?;
let mut tmp_writer = File::create(tmp_path)?;
let mut count = 0;
let old_headers = cursor_read.walk(None)?.filter_map(|row| {
let (block_number, header) = row.ok()?;
if !using_old_header(block_number, &header) {
None
} else {
Some((block_number, Header::decompress(&header).ok()?))
}
});
for (block_number, header) in old_headers {
let receipt =
db_env.receipts_by_block(block_number.into())?.expect("Receipt not found");
let new_header = to_hl_header(receipt, header);
tmp_writer.write_all(&rmp_serde::to_vec(&(block_number, new_header))?)?;
count += 1;
}
Ok(count)
}
fn import_new_headers(&self, tmp_path: PathBuf, count: i32) -> Result<(), eyre::Error> {
let mut tmp_reader = File::open(tmp_path)?;
let db_env = self.0.provider_factory.provider_rw()?;
let mut cursor_write = db_env.tx_ref().cursor_write::<tables::Headers<Bytes>>()?;
for _ in 0..count {
let (number, header) = rmp_serde::from_read::<_, (u64, HlHeader)>(&mut tmp_reader)?;
cursor_write.upsert(number, &rmp_serde::to_vec(&header)?.into())?;
}
db_env.commit()?;
Ok(())
}
}
fn check_if_migration_enabled() -> Result<(), eyre::Error> {
if std::env::var("EXPERIMENTAL_MIGRATE_DB").is_err() {
let err_msg = concat!(
"Detected an old database format but experimental database migration is currently disabled. ",
"To enable migration, set EXPERIMENTAL_MIGRATE_DB=1, or alternatively, resync your node (safest option)."
);
warn!("{}", err_msg);
return Err(eyre::eyre!("{}", err_msg));
}
Ok(())
}
struct MigrateStaticFiles<'a, N: HlNodeType>(&'a Migrator<N>);
impl<'a, N: HlNodeType> MigrateStaticFiles<'a, N> {
fn iterate_files_for_segment(
&self,
block_range: SegmentRangeInclusive,
dir: &PathBuf,
) -> eyre::Result<Vec<(PathBuf, String)>> {
let prefix = StaticFileSegment::Headers.filename(&block_range);
let entries = std::fs::read_dir(dir)?
.map(|res| res.map(|e| e.path()))
.collect::<Result<Vec<_>, _>>()?;
Ok(entries
.into_iter()
.filter_map(|path| {
let file_name = path.file_name().and_then(|f| f.to_str())?;
if file_name.starts_with(&prefix) {
Some((path.clone(), file_name.to_string()))
} else {
None
}
})
.collect())
}
fn create_placeholder(&self, block_range: SegmentRangeInclusive) -> eyre::Result<()> {
// The src/dst direction here is the opposite of move_static_files_for_segment
let src = self.0.data_dir.static_files();
let dst = self.0.conversion_tmp_dir();
for (src_path, file_name) in self.iterate_files_for_segment(block_range, &src)? {
let dst_path = dst.join(file_name);
if dst_path.exists() {
std::fs::remove_file(&dst_path)?;
}
std::os::unix::fs::symlink(src_path, dst_path)?;
}
Ok(())
}
fn move_static_files_for_segment(
&self,
block_range: SegmentRangeInclusive,
) -> eyre::Result<()> {
let src = self.0.conversion_tmp_dir();
let dst = self.0.data_dir.static_files();
for (src_path, file_name) in self.iterate_files_for_segment(block_range, &src)? {
let dst_path = dst.join(file_name);
std::fs::remove_file(&dst_path)?;
std::fs::rename(&src_path, &dst_path)?;
}
// StaticFileProvider still needs the file to exist, so we create a symlink
self.create_placeholder(block_range)
}
fn migrate_static_files(&self) -> eyre::Result<bool> {
let conversion_tmp = self.0.conversion_tmp_dir();
let old_path = self.0.data_dir.static_files();
if conversion_tmp.exists() {
std::fs::remove_dir_all(&conversion_tmp)?;
}
std::fs::create_dir_all(&conversion_tmp)?;
let mut all_static_files = iter_static_files(&old_path)?;
let all_static_files =
all_static_files.remove(&StaticFileSegment::Headers).unwrap_or_default();
let mut first = true;
for (block_range, _tx_ranges) in all_static_files {
let migration_needed = self.using_old_header(block_range.start())? ||
self.using_old_header(block_range.end())?;
if !migration_needed {
// Create a placeholder symlink
self.create_placeholder(block_range)?;
continue;
}
if first {
check_if_migration_enabled()?;
info!("Old database detected, migrating static files...");
first = false;
}
let sf_provider = self.0.sf_provider();
let sf_tmp_provider = StaticFileProvider::<HlPrimitives>::read_write(&conversion_tmp)?;
let provider = self.0.provider_factory.provider()?;
let block_range_for_filename = sf_provider.find_fixed_range(block_range.start());
migrate_single_static_file(&sf_tmp_provider, &sf_provider, &provider, block_range)?;
self.move_static_files_for_segment(block_range_for_filename)?;
}
Ok(!first)
}
fn using_old_header(&self, number: u64) -> eyre::Result<bool> {
let sf_provider = self.0.sf_provider();
let content = old_headers_range(&sf_provider, number..=number)?;
let &[row] = &content.as_slice() else {
warn!("No header found for block {}", number);
return Ok(false);
};
Ok(using_old_header(number, &row[0]))
}
}
// The problem is that decompress simply panics when the header is not valid,
// so we need heuristics...
fn is_old_header(header: &[u8]) -> bool {
const SHA3_UNCLE_OFFSET: usize = 0x24;
const SHA3_UNCLE_HASH: B256 =
b256!("1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347");
const GENESIS_PREFIX: [u8; 4] = [0x01, 0x20, 0x00, 0xf8];
let Some(sha3_uncle_hash) = header.get(SHA3_UNCLE_OFFSET..SHA3_UNCLE_OFFSET + 32) else {
return false;
};
if sha3_uncle_hash == SHA3_UNCLE_HASH {
return true;
}
// genesis block might be different
if header.starts_with(&GENESIS_PREFIX) {
return true;
}
false
}
fn is_new_header(header: &[u8]) -> bool {
rmp_serde::from_slice::<HlHeader>(header).is_ok()
}
fn migrate_single_static_file<N: HlNodeType>(
sf_out: &StaticFileProvider<HlPrimitives>,
sf_in: &StaticFileProvider<HlPrimitives>,
provider: &DatabaseProvider<Tx<RO>, NodeTypesWithDBAdapter<N, Arc<DatabaseEnv>>>,
block_range: SegmentRangeInclusive,
) -> Result<(), eyre::Error> {
info!("Migrating block range {}...", block_range);
// Split the block range into chunks of 50,000 blocks
const CHUNK_SIZE: u64 = 50000;
for chunk in (block_range.start()..=block_range.end()).step_by(CHUNK_SIZE as usize) {
let end = std::cmp::min(chunk + CHUNK_SIZE - 1, block_range.end());
let block_range = chunk..=end;
let headers = old_headers_range(sf_in, block_range.clone())?;
let receipts = provider.receipts_by_block_range(block_range.clone())?;
assert_eq!(headers.len(), receipts.len());
let mut writer = sf_out.get_writer(*block_range.start(), StaticFileSegment::Headers)?;
let new_headers = std::iter::zip(headers, receipts)
.map(|(header, receipts)| {
let eth_header = Header::decompress(&header[0]).unwrap();
let hl_header = to_hl_header(receipts, eth_header);
let difficulty: U256 = CompactU256::decompress(&header[1]).unwrap().into();
let hash = BlockHash::decompress(&header[2]).unwrap();
(hl_header, difficulty, hash)
})
.collect::<Vec<_>>();
for header in new_headers {
writer.append_header(&header.0, header.1, &header.2)?;
}
writer.commit().unwrap();
info!("Migrated block range {:?}...", block_range);
}
Ok(())
}
fn to_hl_header(receipts: Vec<EthereumReceipt>, eth_header: Header) -> HlHeader {
let system_tx_count = receipts.iter().filter(|r| r.cumulative_gas_used == 0).count();
HlHeader::from_ethereum_header(eth_header, &receipts, system_tx_count as u64)
}
fn old_headers_range(
provider: &StaticFileProvider<HlPrimitives>,
block_range: impl std::ops::RangeBounds<u64>,
) -> ProviderResult<Vec<Vec<Vec<u8>>>> {
Ok(provider
.fetch_range_with_predicate(
StaticFileSegment::Headers,
to_range(block_range),
|cursor, number| {
cursor.get(number.into(), 0b111).map(|rows| {
rows.map(|columns| columns.into_iter().map(|column| column.to_vec()).collect())
})
},
|_| true,
)?
.into_iter()
.collect())
}
// Copied from reth
fn to_range<R: std::ops::RangeBounds<u64>>(bounds: R) -> std::ops::Range<u64> {
let start = match bounds.start_bound() {
std::ops::Bound::Included(&v) => v,
std::ops::Bound::Excluded(&v) => v + 1,
std::ops::Bound::Unbounded => 0,
};
let end = match bounds.end_bound() {
std::ops::Bound::Included(&v) => v + 1,
std::ops::Bound::Excluded(&v) => v,
std::ops::Bound::Unbounded => u64::MAX,
};
start..end
}
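// Illustrative only (not part of the original migrator): a small test pinning down the
// `to_range` semantics used above. Inclusive ranges become half-open, and unbounded ends
// map to 0 and u64::MAX respectively.
#[cfg(test)]
mod to_range_tests {
    use super::to_range;

    #[test]
    fn inclusive_bounds_become_half_open() {
        assert_eq!(to_range(5..=10), 5..11);
        assert_eq!(to_range(5..10u64), 5..10);
        assert_eq!(to_range(..), 0..u64::MAX);
    }
}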
fn using_old_header(number: u64, header: &[u8]) -> bool {
let deserialized_old = is_old_header(header);
let deserialized_new = is_new_header(header);
assert!(
deserialized_old ^ deserialized_new,
"Header is not valid: {} {}\ndeserialized_old: {}\ndeserialized_new: {}",
number,
header.encode_hex(),
deserialized_old,
deserialized_new
);
deserialized_old && !deserialized_new
}
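As a quick illustration of the chunking arithmetic used by migrate_single_static_file above, the following self-contained sketch (a hypothetical helper, not part of the codebase) shows how an inclusive block range is walked in CHUNK_SIZE-sized pieces, with the final chunk clamped to the range end:
// Sketch only: mirrors the `step_by(CHUNK_SIZE)` / `min(chunk + CHUNK_SIZE - 1, end)` logic.
fn chunk_bounds(start: u64, end: u64, chunk_size: u64) -> Vec<(u64, u64)> {
    (start..=end)
        .step_by(chunk_size as usize)
        .map(|chunk_start| (chunk_start, std::cmp::min(chunk_start + chunk_size - 1, end)))
        .collect()
}

fn main() {
    // A 120_000-block range with CHUNK_SIZE = 50_000 yields three chunks,
    // the last one truncated at the range end.
    assert_eq!(
        chunk_bounds(0, 119_999, 50_000),
        vec![(0, 49_999), (50_000, 99_999), (100_000, 119_999)]
    );
}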

View File

@@ -33,6 +33,7 @@ pub mod cli;
 pub mod consensus;
 pub mod engine;
 pub mod evm;
+pub mod migrate;
 pub mod network;
 pub mod primitives;
 pub mod rpc;
@@ -50,12 +51,14 @@ pub struct HlNode {
     engine_handle_rx: Arc<Mutex<Option<oneshot::Receiver<ConsensusEngineHandle<HlPayloadTypes>>>>>,
     block_source_config: BlockSourceConfig,
     debug_cutoff_height: Option<u64>,
+    allow_network_overrides: bool,
 }
 impl HlNode {
     pub fn new(
         block_source_config: BlockSourceConfig,
         debug_cutoff_height: Option<u64>,
+        allow_network_overrides: bool,
     ) -> (Self, oneshot::Sender<ConsensusEngineHandle<HlPayloadTypes>>) {
         let (tx, rx) = oneshot::channel();
         (
@@ -63,6 +66,7 @@ impl HlNode {
                 engine_handle_rx: Arc::new(Mutex::new(Some(rx))),
                 block_source_config,
                 debug_cutoff_height,
+                allow_network_overrides,
             },
             tx,
         )
@@ -94,6 +98,7 @@ impl HlNode {
                 engine_handle_rx: self.engine_handle_rx.clone(),
                 block_source_config: self.block_source_config.clone(),
                 debug_cutoff_height: self.debug_cutoff_height,
+                allow_network_overrides: self.allow_network_overrides,
             })
             .consensus(HlConsensusBuilder::default())
     }

View File

@@ -179,7 +179,7 @@ where
 #[cfg(test)]
 mod tests {
-    use crate::chainspec::hl::hl_mainnet;
+    use crate::{HlHeader, chainspec::hl::hl_mainnet};
     use super::*;
     use alloy_primitives::{B256, U128};
@@ -355,7 +355,7 @@ mod tests {
     /// Creates a test block message
     fn create_test_block() -> NewBlockMessage<HlNewBlock> {
         let block = HlBlock {
-            header: Header::default(),
+            header: HlHeader::default(),
             body: HlBlockBody {
                 inner: BlockBody {
                     transactions: Vec::new(),
View File

@@ -25,7 +25,10 @@ use reth_network::{NetworkConfig, NetworkHandle, NetworkManager};
 use reth_network_api::PeersInfo;
 use reth_provider::StageCheckpointReader;
 use reth_stages_types::StageId;
-use std::sync::Arc;
+use std::{
+    net::{Ipv4Addr, SocketAddr},
+    sync::Arc,
+};
 use tokio::sync::{Mutex, mpsc, oneshot};
 use tracing::info;
@@ -38,10 +41,10 @@ pub struct HlNewBlock(pub NewBlock<HlBlock>);
 mod rlp {
     use super::*;
     use crate::{
-        HlBlockBody,
+        HlBlockBody, HlHeader,
         node::primitives::{BlockBody, TransactionSigned},
     };
-    use alloy_consensus::{BlobTransactionSidecar, Header};
+    use alloy_consensus::BlobTransactionSidecar;
     use alloy_primitives::{Address, U128};
     use alloy_rlp::{RlpDecodable, RlpEncodable};
     use alloy_rpc_types::Withdrawals;
@@ -50,9 +53,9 @@ mod rlp {
     #[derive(RlpEncodable, RlpDecodable)]
     #[rlp(trailing)]
     struct BlockHelper<'a> {
-        header: Cow<'a, Header>,
+        header: Cow<'a, HlHeader>,
         transactions: Cow<'a, Vec<TransactionSigned>>,
-        ommers: Cow<'a, Vec<Header>>,
+        ommers: Cow<'a, Vec<HlHeader>>,
         withdrawals: Option<Cow<'a, Withdrawals>>,
     }
@@ -144,6 +147,8 @@ pub struct HlNetworkBuilder {
     pub(crate) block_source_config: BlockSourceConfig,
     pub(crate) debug_cutoff_height: Option<u64>,
+    pub(crate) allow_network_overrides: bool,
 }
 impl HlNetworkBuilder {
@@ -174,15 +179,24 @@ impl HlNetworkBuilder {
             ImportService::new(consensus, handle, from_network, to_network).await.unwrap();
         });
-        Ok(ctx.build_network_config(
-            ctx.network_config_builder()?
-                .disable_dns_discovery()
-                .disable_nat()
-                .boot_nodes(boot_nodes())
-                .set_head(ctx.head())
-                .with_pow()
-                .block_import(Box::new(HlBlockImport::new(handle))),
-        ))
+        let mut config_builder = ctx.network_config_builder()?;
+        // Only apply localhost-only network settings if network overrides are NOT allowed
+        if !self.allow_network_overrides {
+            config_builder = config_builder
+                .discovery_addr(SocketAddr::new(Ipv4Addr::LOCALHOST.into(), 0))
+                .listener_addr(SocketAddr::new(Ipv4Addr::LOCALHOST.into(), 0))
+                .disable_dns_discovery()
+                .disable_nat();
+        }
+        config_builder = config_builder
+            .boot_nodes(boot_nodes())
+            .set_head(ctx.head())
+            .with_pow()
+            .block_import(Box::new(HlBlockImport::new(handle)));
+        Ok(ctx.build_network_config(config_builder))
     }
 }

View File

@ -0,0 +1,49 @@
use super::{HlBlockBody, HlHeader, rlp};
use alloy_rlp::Encodable;
use reth_primitives_traits::{Block, InMemorySize};
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
/// Block for HL
#[derive(Debug, Clone, Default, PartialEq, Eq, Serialize, Deserialize)]
pub struct HlBlock {
pub header: HlHeader,
pub body: HlBlockBody,
}
impl InMemorySize for HlBlock {
fn size(&self) -> usize {
self.header.size() + self.body.size()
}
}
impl Block for HlBlock {
type Header = HlHeader;
type Body = HlBlockBody;
fn new(header: Self::Header, body: Self::Body) -> Self {
Self { header, body }
}
fn header(&self) -> &Self::Header {
&self.header
}
fn body(&self) -> &Self::Body {
&self.body
}
fn split(self) -> (Self::Header, Self::Body) {
(self.header, self.body)
}
fn rlp_length(header: &Self::Header, body: &Self::Body) -> usize {
rlp::BlockHelper {
header: Cow::Borrowed(header),
transactions: Cow::Borrowed(&body.inner.transactions),
ommers: Cow::Borrowed(&body.inner.ommers),
withdrawals: body.inner.withdrawals.as_ref().map(Cow::Borrowed),
sidecars: body.sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: body.read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: body.highest_precompile_address.as_ref().map(Cow::Borrowed),
}
.length()
}
}

View File

@ -0,0 +1,80 @@
use alloy_consensus::BlobTransactionSidecar;
use alloy_primitives::Address;
use reth_primitives_traits::{BlockBody as BlockBodyTrait, InMemorySize};
use serde::{Deserialize, Serialize};
use crate::{
HlHeader,
node::{
primitives::TransactionSigned,
types::{ReadPrecompileCall, ReadPrecompileCalls},
},
};
/// Block body for HL. It is equivalent to Ethereum [`BlockBody`] but additionally stores sidecars
/// for blob transactions.
#[derive(
Debug,
Clone,
Default,
PartialEq,
Eq,
Serialize,
Deserialize,
derive_more::Deref,
derive_more::DerefMut,
)]
pub struct HlBlockBody {
#[serde(flatten)]
#[deref]
#[deref_mut]
pub inner: BlockBody,
pub sidecars: Option<Vec<BlobTransactionSidecar>>,
pub read_precompile_calls: Option<ReadPrecompileCalls>,
pub highest_precompile_address: Option<Address>,
}
pub type BlockBody = alloy_consensus::BlockBody<TransactionSigned, HlHeader>;
impl InMemorySize for HlBlockBody {
fn size(&self) -> usize {
self.inner.size() +
self.sidecars
.as_ref()
.map_or(0, |s| s.capacity() * core::mem::size_of::<BlobTransactionSidecar>()) +
self.read_precompile_calls
.as_ref()
.map_or(0, |s| s.0.capacity() * core::mem::size_of::<ReadPrecompileCall>())
}
}
impl BlockBodyTrait for HlBlockBody {
type Transaction = TransactionSigned;
type OmmerHeader = super::HlHeader;
fn transactions(&self) -> &[Self::Transaction] {
BlockBodyTrait::transactions(&self.inner)
}
fn into_ethereum_body(self) -> BlockBody {
self.inner
}
fn into_transactions(self) -> Vec<Self::Transaction> {
self.inner.into_transactions()
}
fn withdrawals(&self) -> Option<&alloy_rpc_types::Withdrawals> {
self.inner.withdrawals()
}
fn ommers(&self) -> Option<&[Self::OmmerHeader]> {
self.inner.ommers()
}
fn calculate_tx_root(&self) -> alloy_primitives::B256 {
alloy_consensus::proofs::calculate_transaction_root(
&self
.transactions()
.iter()
.filter(|tx| !tx.is_system_transaction())
.collect::<Vec<_>>(),
)
}
}

View File

@ -0,0 +1,246 @@
use alloy_consensus::Header;
use alloy_primitives::{Address, B64, B256, BlockNumber, Bloom, Bytes, Sealable, U256};
use alloy_rlp::{RlpDecodable, RlpEncodable};
use reth_cli_commands::common::CliHeader;
use reth_codecs::Compact;
use reth_ethereum_primitives::EthereumReceipt;
use reth_primitives::{SealedHeader, logs_bloom};
use reth_primitives_traits::{BlockHeader, InMemorySize, serde_bincode_compat::RlpBincode};
use reth_rpc_convert::transaction::FromConsensusHeader;
use serde::{Deserialize, Serialize};
/// The header type of this node
///
/// This type extends the regular Ethereum header with HL-specific extra fields.
#[derive(
Clone,
Debug,
PartialEq,
Eq,
Hash,
derive_more::AsRef,
derive_more::Deref,
Default,
RlpEncodable,
RlpDecodable,
Serialize,
Deserialize,
)]
#[serde(rename_all = "camelCase")]
pub struct HlHeader {
/// The regular eth header
#[as_ref]
#[deref]
pub inner: Header,
/// The extended header fields that are not part of the block hash
pub extras: HlHeaderExtras,
}
#[derive(
Debug, Clone, Default, PartialEq, Eq, Serialize, Deserialize, RlpEncodable, RlpDecodable, Hash,
)]
pub struct HlHeaderExtras {
pub logs_bloom_with_system_txs: Bloom,
pub system_tx_count: u64,
}
impl HlHeader {
pub(crate) fn from_ethereum_header(
header: Header,
receipts: &[EthereumReceipt],
system_tx_count: u64,
) -> HlHeader {
let logs_bloom = logs_bloom(receipts.iter().flat_map(|r| &r.logs));
HlHeader {
inner: header,
extras: HlHeaderExtras { logs_bloom_with_system_txs: logs_bloom, system_tx_count },
}
}
}
impl From<Header> for HlHeader {
fn from(_value: Header) -> Self {
unreachable!()
}
}
impl AsRef<Self> for HlHeader {
fn as_ref(&self) -> &Self {
self
}
}
impl Sealable for HlHeader {
fn hash_slow(&self) -> B256 {
self.inner.hash_slow()
}
}
impl alloy_consensus::BlockHeader for HlHeader {
fn parent_hash(&self) -> B256 {
self.inner.parent_hash()
}
fn ommers_hash(&self) -> B256 {
self.inner.ommers_hash()
}
fn beneficiary(&self) -> Address {
self.inner.beneficiary()
}
fn state_root(&self) -> B256 {
self.inner.state_root()
}
fn transactions_root(&self) -> B256 {
self.inner.transactions_root()
}
fn receipts_root(&self) -> B256 {
self.inner.receipts_root()
}
fn withdrawals_root(&self) -> Option<B256> {
self.inner.withdrawals_root()
}
fn logs_bloom(&self) -> Bloom {
self.extras.logs_bloom_with_system_txs
}
fn difficulty(&self) -> U256 {
self.inner.difficulty()
}
fn number(&self) -> BlockNumber {
self.inner.number()
}
fn gas_limit(&self) -> u64 {
self.inner.gas_limit()
}
fn gas_used(&self) -> u64 {
self.inner.gas_used()
}
fn timestamp(&self) -> u64 {
self.inner.timestamp()
}
fn mix_hash(&self) -> Option<B256> {
self.inner.mix_hash()
}
fn nonce(&self) -> Option<B64> {
self.inner.nonce()
}
fn base_fee_per_gas(&self) -> Option<u64> {
self.inner.base_fee_per_gas()
}
fn blob_gas_used(&self) -> Option<u64> {
self.inner.blob_gas_used()
}
fn excess_blob_gas(&self) -> Option<u64> {
self.inner.excess_blob_gas()
}
fn parent_beacon_block_root(&self) -> Option<B256> {
self.inner.parent_beacon_block_root()
}
fn requests_hash(&self) -> Option<B256> {
self.inner.requests_hash()
}
fn extra_data(&self) -> &Bytes {
self.inner.extra_data()
}
fn is_empty(&self) -> bool {
self.extras.system_tx_count == 0 && self.inner.is_empty()
}
}
impl InMemorySize for HlHeader {
fn size(&self) -> usize {
self.inner.size() + self.extras.size()
}
}
impl InMemorySize for HlHeaderExtras {
fn size(&self) -> usize {
self.logs_bloom_with_system_txs.data().len() + self.system_tx_count.size()
}
}
impl reth_codecs::Compact for HlHeader {
fn to_compact<B>(&self, buf: &mut B) -> usize
where
B: alloy_rlp::bytes::BufMut + AsMut<[u8]>,
{
// Because `Header` ends with `extra_data`, which is `Bytes`, we can't use `to_compact` for the
// extras: the Compact trait requires a `Bytes` field to be placed at the end of the struct,
// and `Bytes::from_compact` just reads all trailing data as that field.
//
// Hence we need another form of serialization, since the extra header fields are not
// Compact-compatible. We simply treat all header fields as a single rmp-serialized `Bytes`
// field.
let result: Bytes = rmp_serde::to_vec(&self).unwrap().into();
result.to_compact(buf)
}
fn from_compact(buf: &[u8], len: usize) -> (Self, &[u8]) {
let (bytes, remaining) = Bytes::from_compact(buf, len);
let header: HlHeader = rmp_serde::from_slice(&bytes).unwrap();
(header, remaining)
}
}
impl reth_db_api::table::Compress for HlHeader {
type Compressed = Vec<u8>;
fn compress_to_buf<B: alloy_primitives::bytes::BufMut + AsMut<[u8]>>(&self, buf: &mut B) {
let _ = Compact::to_compact(self, buf);
}
}
impl reth_db_api::table::Decompress for HlHeader {
fn decompress(value: &[u8]) -> Result<Self, reth_db_api::DatabaseError> {
let (obj, _) = Compact::from_compact(value, value.len());
Ok(obj)
}
}
impl BlockHeader for HlHeader {}
impl RlpBincode for HlHeader {}
impl CliHeader for HlHeader {
fn set_number(&mut self, number: u64) {
self.inner.set_number(number);
}
}
impl From<HlHeader> for Header {
fn from(value: HlHeader) -> Self {
value.inner
}
}
pub fn to_ethereum_ommers(ommers: &[HlHeader]) -> Vec<Header> {
ommers.iter().map(|ommer| ommer.clone().into()).collect()
}
impl FromConsensusHeader<HlHeader> for alloy_rpc_types::Header {
fn from_consensus_header(header: SealedHeader<HlHeader>, block_size: usize) -> Self {
FromConsensusHeader::<Header>::from_consensus_header(
SealedHeader::<Header>::new(header.inner.clone(), header.hash()),
block_size,
)
}
}
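The Compact implementation above sidesteps the trailing-`Bytes` limitation by rmp-serializing the whole header into a single `Bytes` value. A minimal round-trip sketch (illustrative, not part of header.rs):
use reth_codecs::Compact;

fn compact_roundtrip(header: &HlHeader) -> HlHeader {
    let mut buf = Vec::new();
    // to_compact rmp-serializes the full header and writes it as one Compact `Bytes` field.
    let len = header.to_compact(&mut buf);
    // from_compact reads the `Bytes` back and rmp-deserializes it into an HlHeader.
    let (decoded, _remaining) = HlHeader::from_compact(&buf, len);
    decoded
}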

View File

@@ -1,17 +1,18 @@
-#![allow(clippy::owned_cow)]
-use alloy_consensus::{BlobTransactionSidecar, Header};
-use alloy_primitives::Address;
-use alloy_rlp::{Encodable, RlpDecodable, RlpEncodable};
 use reth_ethereum_primitives::Receipt;
 use reth_primitives::NodePrimitives;
-use reth_primitives_traits::{Block, BlockBody as BlockBodyTrait, InMemorySize};
-use serde::{Deserialize, Serialize};
-use std::borrow::Cow;
-use crate::node::types::{ReadPrecompileCall, ReadPrecompileCalls};
-pub mod tx_wrapper;
-pub use tx_wrapper::{BlockBody, TransactionSigned};
+pub mod transaction;
+pub use transaction::TransactionSigned;
+pub mod block;
+pub use block::HlBlock;
+pub mod body;
+pub use body::{BlockBody, HlBlockBody};
+pub mod header;
+pub use header::HlHeader;
+pub mod rlp;
+pub mod serde_bincode_compat;
 /// Primitive types for HyperEVM.
 #[derive(Debug, Clone, Copy, Default, PartialEq, Eq)]
@@ -20,321 +21,8 @@ pub struct HlPrimitives;
 impl NodePrimitives for HlPrimitives {
     type Block = HlBlock;
-    type BlockHeader = Header;
+    type BlockHeader = HlHeader;
     type BlockBody = HlBlockBody;
     type SignedTx = TransactionSigned;
     type Receipt = Receipt;
 }
/// Block body for HL. It is equivalent to Ethereum [`BlockBody`] but additionally stores sidecars
/// for blob transactions.
#[derive(
Debug,
Clone,
Default,
PartialEq,
Eq,
Serialize,
Deserialize,
derive_more::Deref,
derive_more::DerefMut,
)]
pub struct HlBlockBody {
#[serde(flatten)]
#[deref]
#[deref_mut]
pub inner: BlockBody,
pub sidecars: Option<Vec<BlobTransactionSidecar>>,
pub read_precompile_calls: Option<ReadPrecompileCalls>,
pub highest_precompile_address: Option<Address>,
}
impl InMemorySize for HlBlockBody {
fn size(&self) -> usize {
self.inner.size() +
self.sidecars
.as_ref()
.map_or(0, |s| s.capacity() * core::mem::size_of::<BlobTransactionSidecar>()) +
self.read_precompile_calls
.as_ref()
.map_or(0, |s| s.0.capacity() * core::mem::size_of::<ReadPrecompileCall>())
}
}
impl BlockBodyTrait for HlBlockBody {
type Transaction = TransactionSigned;
type OmmerHeader = Header;
fn transactions(&self) -> &[Self::Transaction] {
BlockBodyTrait::transactions(&self.inner)
}
fn into_ethereum_body(self) -> BlockBody {
self.inner
}
fn into_transactions(self) -> Vec<Self::Transaction> {
self.inner.into_transactions()
}
fn withdrawals(&self) -> Option<&alloy_rpc_types::Withdrawals> {
self.inner.withdrawals()
}
fn ommers(&self) -> Option<&[Self::OmmerHeader]> {
self.inner.ommers()
}
fn calculate_tx_root(&self) -> alloy_primitives::B256 {
alloy_consensus::proofs::calculate_transaction_root(
&self
.transactions()
.iter()
.filter(|tx| !tx.is_system_transaction())
.collect::<Vec<_>>(),
)
}
}
/// Block for HL
#[derive(Debug, Clone, Default, PartialEq, Eq, Serialize, Deserialize)]
pub struct HlBlock {
pub header: Header,
pub body: HlBlockBody,
}
impl InMemorySize for HlBlock {
fn size(&self) -> usize {
self.header.size() + self.body.size()
}
}
impl Block for HlBlock {
type Header = Header;
type Body = HlBlockBody;
fn new(header: Self::Header, body: Self::Body) -> Self {
Self { header, body }
}
fn header(&self) -> &Self::Header {
&self.header
}
fn body(&self) -> &Self::Body {
&self.body
}
fn split(self) -> (Self::Header, Self::Body) {
(self.header, self.body)
}
fn rlp_length(header: &Self::Header, body: &Self::Body) -> usize {
rlp::BlockHelper {
header: Cow::Borrowed(header),
transactions: Cow::Borrowed(&body.inner.transactions),
ommers: Cow::Borrowed(&body.inner.ommers),
withdrawals: body.inner.withdrawals.as_ref().map(Cow::Borrowed),
sidecars: body.sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: body.read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: body.highest_precompile_address.as_ref().map(Cow::Borrowed),
}
.length()
}
}
mod rlp {
use super::*;
use alloy_eips::eip4895::Withdrawals;
use alloy_rlp::Decodable;
#[derive(RlpEncodable, RlpDecodable)]
#[rlp(trailing)]
struct BlockBodyHelper<'a> {
transactions: Cow<'a, Vec<TransactionSigned>>,
ommers: Cow<'a, Vec<Header>>,
withdrawals: Option<Cow<'a, Withdrawals>>,
sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
highest_precompile_address: Option<Cow<'a, Address>>,
}
#[derive(RlpEncodable, RlpDecodable)]
#[rlp(trailing)]
pub(crate) struct BlockHelper<'a> {
pub(crate) header: Cow<'a, Header>,
pub(crate) transactions: Cow<'a, Vec<TransactionSigned>>,
pub(crate) ommers: Cow<'a, Vec<Header>>,
pub(crate) withdrawals: Option<Cow<'a, Withdrawals>>,
pub(crate) sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
pub(crate) read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
pub(crate) highest_precompile_address: Option<Cow<'a, Address>>,
}
impl<'a> From<&'a HlBlockBody> for BlockBodyHelper<'a> {
fn from(value: &'a HlBlockBody) -> Self {
let HlBlockBody {
inner: BlockBody { transactions, ommers, withdrawals },
sidecars,
read_precompile_calls,
highest_precompile_address,
} = value;
Self {
transactions: Cow::Borrowed(transactions),
ommers: Cow::Borrowed(ommers),
withdrawals: withdrawals.as_ref().map(Cow::Borrowed),
sidecars: sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: highest_precompile_address.as_ref().map(Cow::Borrowed),
}
}
}
impl<'a> From<&'a HlBlock> for BlockHelper<'a> {
fn from(value: &'a HlBlock) -> Self {
let HlBlock {
header,
body:
HlBlockBody {
inner: BlockBody { transactions, ommers, withdrawals },
sidecars,
read_precompile_calls,
highest_precompile_address,
},
} = value;
Self {
header: Cow::Borrowed(header),
transactions: Cow::Borrowed(transactions),
ommers: Cow::Borrowed(ommers),
withdrawals: withdrawals.as_ref().map(Cow::Borrowed),
sidecars: sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: highest_precompile_address.as_ref().map(Cow::Borrowed),
}
}
}
impl Encodable for HlBlockBody {
fn encode(&self, out: &mut dyn bytes::BufMut) {
BlockBodyHelper::from(self).encode(out);
}
fn length(&self) -> usize {
BlockBodyHelper::from(self).length()
}
}
impl Decodable for HlBlockBody {
fn decode(buf: &mut &[u8]) -> alloy_rlp::Result<Self> {
let BlockBodyHelper {
transactions,
ommers,
withdrawals,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = BlockBodyHelper::decode(buf)?;
Ok(Self {
inner: BlockBody {
transactions: transactions.into_owned(),
ommers: ommers.into_owned(),
withdrawals: withdrawals.map(|w| w.into_owned()),
},
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
})
}
}
impl Encodable for HlBlock {
fn encode(&self, out: &mut dyn bytes::BufMut) {
BlockHelper::from(self).encode(out);
}
fn length(&self) -> usize {
BlockHelper::from(self).length()
}
}
impl Decodable for HlBlock {
fn decode(buf: &mut &[u8]) -> alloy_rlp::Result<Self> {
let BlockHelper {
header,
transactions,
ommers,
withdrawals,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = BlockHelper::decode(buf)?;
Ok(Self {
header: header.into_owned(),
body: HlBlockBody {
inner: BlockBody {
transactions: transactions.into_owned(),
ommers: ommers.into_owned(),
withdrawals: withdrawals.map(|w| w.into_owned()),
},
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
},
})
}
}
}
pub mod serde_bincode_compat {
use super::*;
use reth_primitives_traits::serde_bincode_compat::{BincodeReprFor, SerdeBincodeCompat};
#[derive(Debug, Serialize, Deserialize)]
pub struct HlBlockBodyBincode<'a> {
inner: BincodeReprFor<'a, BlockBody>,
sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
highest_precompile_address: Option<Cow<'a, Address>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct HlBlockBincode<'a> {
header: BincodeReprFor<'a, Header>,
body: BincodeReprFor<'a, HlBlockBody>,
}
impl SerdeBincodeCompat for HlBlockBody {
type BincodeRepr<'a> = HlBlockBodyBincode<'a>;
fn as_repr(&self) -> Self::BincodeRepr<'_> {
HlBlockBodyBincode {
inner: self.inner.as_repr(),
sidecars: self.sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: self.read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: self
.highest_precompile_address
.as_ref()
.map(Cow::Borrowed),
}
}
fn from_repr(repr: Self::BincodeRepr<'_>) -> Self {
let HlBlockBodyBincode {
inner,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = repr;
Self {
inner: BlockBody::from_repr(inner),
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
}
}
}
impl SerdeBincodeCompat for HlBlock {
type BincodeRepr<'a> = HlBlockBincode<'a>;
fn as_repr(&self) -> Self::BincodeRepr<'_> {
HlBlockBincode { header: self.header.as_repr(), body: self.body.as_repr() }
}
fn from_repr(repr: Self::BincodeRepr<'_>) -> Self {
let HlBlockBincode { header, body } = repr;
Self { header: Header::from_repr(header), body: HlBlockBody::from_repr(body) }
}
}
}

src/node/primitives/rlp.rs Normal file
View File

@ -0,0 +1,142 @@
#![allow(clippy::owned_cow)]
use super::{HlBlock, HlBlockBody, TransactionSigned};
use crate::{HlHeader, node::types::ReadPrecompileCalls};
use alloy_consensus::{BlobTransactionSidecar, BlockBody};
use alloy_eips::eip4895::Withdrawals;
use alloy_primitives::Address;
use alloy_rlp::{Decodable, Encodable, RlpDecodable, RlpEncodable};
use std::borrow::Cow;
#[derive(RlpEncodable, RlpDecodable)]
#[rlp(trailing)]
struct BlockBodyHelper<'a> {
transactions: Cow<'a, Vec<TransactionSigned>>,
ommers: Cow<'a, Vec<HlHeader>>,
withdrawals: Option<Cow<'a, Withdrawals>>,
sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
highest_precompile_address: Option<Cow<'a, Address>>,
}
#[derive(RlpEncodable, RlpDecodable)]
#[rlp(trailing)]
pub(crate) struct BlockHelper<'a> {
pub(crate) header: Cow<'a, HlHeader>,
pub(crate) transactions: Cow<'a, Vec<TransactionSigned>>,
pub(crate) ommers: Cow<'a, Vec<HlHeader>>,
pub(crate) withdrawals: Option<Cow<'a, Withdrawals>>,
pub(crate) sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
pub(crate) read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
pub(crate) highest_precompile_address: Option<Cow<'a, Address>>,
}
impl<'a> From<&'a HlBlockBody> for BlockBodyHelper<'a> {
fn from(value: &'a HlBlockBody) -> Self {
let HlBlockBody {
inner: BlockBody { transactions, ommers, withdrawals },
sidecars,
read_precompile_calls,
highest_precompile_address,
} = value;
Self {
transactions: Cow::Borrowed(transactions),
ommers: Cow::Borrowed(ommers),
withdrawals: withdrawals.as_ref().map(Cow::Borrowed),
sidecars: sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: highest_precompile_address.as_ref().map(Cow::Borrowed),
}
}
}
impl<'a> From<&'a HlBlock> for BlockHelper<'a> {
fn from(value: &'a HlBlock) -> Self {
let HlBlock {
header,
body:
HlBlockBody {
inner: BlockBody { transactions, ommers, withdrawals },
sidecars,
read_precompile_calls,
highest_precompile_address,
},
} = value;
Self {
header: Cow::Borrowed(header),
transactions: Cow::Borrowed(transactions),
ommers: Cow::Borrowed(ommers),
withdrawals: withdrawals.as_ref().map(Cow::Borrowed),
sidecars: sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: highest_precompile_address.as_ref().map(Cow::Borrowed),
}
}
}
impl Encodable for HlBlockBody {
fn encode(&self, out: &mut dyn bytes::BufMut) {
BlockBodyHelper::from(self).encode(out);
}
fn length(&self) -> usize {
BlockBodyHelper::from(self).length()
}
}
impl Decodable for HlBlockBody {
fn decode(buf: &mut &[u8]) -> alloy_rlp::Result<Self> {
let BlockBodyHelper {
transactions,
ommers,
withdrawals,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = BlockBodyHelper::decode(buf)?;
Ok(Self {
inner: BlockBody {
transactions: transactions.into_owned(),
ommers: ommers.into_owned(),
withdrawals: withdrawals.map(|w| w.into_owned()),
},
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
})
}
}
impl Encodable for HlBlock {
fn encode(&self, out: &mut dyn bytes::BufMut) {
BlockHelper::from(self).encode(out);
}
fn length(&self) -> usize {
BlockHelper::from(self).length()
}
}
impl Decodable for HlBlock {
fn decode(buf: &mut &[u8]) -> alloy_rlp::Result<Self> {
let BlockHelper {
header,
transactions,
ommers,
withdrawals,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = BlockHelper::decode(buf)?;
Ok(Self {
header: header.into_owned(),
body: HlBlockBody {
inner: BlockBody {
transactions: transactions.into_owned(),
ommers: ommers.into_owned(),
withdrawals: withdrawals.map(|w| w.into_owned()),
},
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
},
})
}
}
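Given the Encodable/Decodable impls above, an HlBlock round-trips through RLP via the BlockHelper representation; a small sketch (illustrative, not part of rlp.rs):
use alloy_rlp::{Decodable, Encodable};

fn rlp_roundtrip(block: &HlBlock) -> alloy_rlp::Result<HlBlock> {
    let mut buf = Vec::new();
    block.encode(&mut buf); // encodes through BlockHelper::from(block)
    HlBlock::decode(&mut buf.as_slice())
}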

View File

@ -0,0 +1,67 @@
#![allow(clippy::owned_cow)]
use alloy_consensus::BlobTransactionSidecar;
use alloy_primitives::Address;
use reth_primitives_traits::serde_bincode_compat::{BincodeReprFor, SerdeBincodeCompat};
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
use super::{HlBlock, HlBlockBody};
use crate::{
HlHeader,
node::{primitives::BlockBody, types::ReadPrecompileCalls},
};
#[derive(Debug, Serialize, Deserialize)]
pub struct HlBlockBodyBincode<'a> {
inner: BincodeReprFor<'a, BlockBody>,
sidecars: Option<Cow<'a, Vec<BlobTransactionSidecar>>>,
read_precompile_calls: Option<Cow<'a, ReadPrecompileCalls>>,
highest_precompile_address: Option<Cow<'a, Address>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct HlBlockBincode<'a> {
header: BincodeReprFor<'a, HlHeader>,
body: BincodeReprFor<'a, HlBlockBody>,
}
impl SerdeBincodeCompat for HlBlockBody {
type BincodeRepr<'a> = HlBlockBodyBincode<'a>;
fn as_repr(&self) -> Self::BincodeRepr<'_> {
HlBlockBodyBincode {
inner: self.inner.as_repr(),
sidecars: self.sidecars.as_ref().map(Cow::Borrowed),
read_precompile_calls: self.read_precompile_calls.as_ref().map(Cow::Borrowed),
highest_precompile_address: self.highest_precompile_address.as_ref().map(Cow::Borrowed),
}
}
fn from_repr(repr: Self::BincodeRepr<'_>) -> Self {
let HlBlockBodyBincode {
inner,
sidecars,
read_precompile_calls,
highest_precompile_address,
} = repr;
Self {
inner: BlockBody::from_repr(inner),
sidecars: sidecars.map(|s| s.into_owned()),
read_precompile_calls: read_precompile_calls.map(|s| s.into_owned()),
highest_precompile_address: highest_precompile_address.map(|s| s.into_owned()),
}
}
}
impl SerdeBincodeCompat for HlBlock {
type BincodeRepr<'a> = HlBlockBincode<'a>;
fn as_repr(&self) -> Self::BincodeRepr<'_> {
HlBlockBincode { header: self.header.as_repr(), body: self.body.as_repr() }
}
fn from_repr(repr: Self::BincodeRepr<'_>) -> Self {
let HlBlockBincode { header, body } = repr;
Self { header: HlHeader::from_repr(header), body: HlBlockBody::from_repr(body) }
}
}

View File

@@ -181,8 +181,6 @@ impl SerdeBincodeCompat for TransactionSigned {
     }
 }
-pub type BlockBody = alloy_consensus::BlockBody<TransactionSigned>;
 impl TryFrom<TransactionSigned> for PooledTransactionVariant {
     type Error = <InnerType as TryInto<PooledTransactionVariant>>::Error;
@@ -211,22 +209,6 @@ impl Decompress for TransactionSigned {
     }
 }
-pub fn convert_to_eth_block_body(value: BlockBody) -> alloy_consensus::BlockBody<InnerType> {
-    alloy_consensus::BlockBody {
-        transactions: value.transactions.into_iter().map(|tx| tx.into_inner()).collect(),
-        ommers: value.ommers,
-        withdrawals: value.withdrawals,
-    }
-}
-pub fn convert_to_hl_block_body(value: alloy_consensus::BlockBody<InnerType>) -> BlockBody {
-    BlockBody {
-        transactions: value.transactions.into_iter().map(TransactionSigned::Default).collect(),
-        ommers: value.ommers,
-        withdrawals: value.withdrawals,
-    }
-}
 impl TryIntoSimTx<TransactionSigned> for TransactionRequest {
     fn try_into_sim_tx(self) -> Result<TransactionSigned, ValueError<Self>> {
         let tx = self

View File

@@ -82,9 +82,10 @@ where
         let mut tx_env = self.create_txn_env(&evm_env, request, &mut db)?;
         let mut is_basic_transfer = false;
-        if tx_env.input().is_empty()
-            && let TxKind::Call(to) = tx_env.kind()
-            && let Ok(code) = db.db.account_code(&to) {
+        if tx_env.input().is_empty() &&
+            let TxKind::Call(to) = tx_env.kind() &&
+            let Ok(code) = db.db.account_code(&to)
+        {
             is_basic_transfer = code.map(|code| code.is_empty()).unwrap_or(true);
         }
@@ -105,8 +106,9 @@ where
             let mut min_tx_env = tx_env.clone();
             min_tx_env.set_gas_limit(MIN_TRANSACTION_GAS);
-            if let Ok(res) = evm.transact(min_tx_env).map_err(Self::Error::from_evm_err)
-                && res.result.is_success() {
+            if let Ok(res) = evm.transact(min_tx_env).map_err(Self::Error::from_evm_err) &&
+                res.result.is_success()
+            {
                 return Ok(U256::from(MIN_TRANSACTION_GAS));
             }
         }

src/node/spot_meta/init.rs Normal file
View File

@ -0,0 +1,103 @@
use crate::node::{
spot_meta::{SpotId, erc20_contract_to_spot_token},
storage::tables::{self, SPOT_METADATA_KEY},
types::reth_compat,
};
use alloy_primitives::Address;
use reth_db::{
DatabaseEnv,
cursor::DbCursorRO,
};
use reth_db_api::{
Database,
transaction::DbTx,
};
use std::{collections::BTreeMap, sync::Arc};
use tracing::info;
/// Load spot metadata from database and initialize cache
pub fn load_spot_metadata_cache(db: &Arc<DatabaseEnv>, chain_id: u64) {
// Try to read from database
let data = match db.view(|tx| -> Result<Option<Vec<u8>>, reth_db::DatabaseError> {
let mut cursor = tx.cursor_read::<tables::SpotMetadata>()?;
Ok(cursor.seek_exact(SPOT_METADATA_KEY)?.map(|(_, data)| data.to_vec()))
}) {
Ok(Ok(data)) => data,
Ok(Err(e)) => {
info!(
"Failed to read spot metadata from database: {}. Will fetch on-demand from API.",
e
);
return;
}
Err(e) => {
info!(
"Database view error while loading spot metadata: {}. Will fetch on-demand from API.",
e
);
return;
}
};
// Check if data exists
let Some(data) = data else {
info!(
"No spot metadata found in database for chain {}. Run 'init-state' to populate, or it will be fetched on-demand from API.",
chain_id
);
return;
};
// Deserialize metadata
let serializable_map = match rmp_serde::from_slice::<BTreeMap<Address, u64>>(&data) {
Ok(map) => map,
Err(e) => {
info!("Failed to deserialize spot metadata: {}. Will fetch on-demand from API.", e);
return;
}
};
// Convert and initialize cache
let metadata: BTreeMap<Address, SpotId> =
serializable_map.into_iter().map(|(addr, index)| (addr, SpotId { index })).collect();
info!("Loaded spot metadata from database ({} entries)", metadata.len());
reth_compat::initialize_spot_metadata_cache(metadata);
}
/// Initialize spot metadata in database from API
pub fn init_spot_metadata(
db_path: impl AsRef<std::path::Path>,
db_args: reth_db::mdbx::DatabaseArguments,
chain_id: u64,
) -> eyre::Result<()> {
info!("Initializing spot metadata for chain {}", chain_id);
let db = Arc::new(reth_db::open_db(db_path.as_ref(), db_args)?);
// Check if spot metadata already exists
let exists = db.view(|tx| -> Result<bool, reth_db::DatabaseError> {
let mut cursor = tx.cursor_read::<tables::SpotMetadata>()?;
Ok(cursor.seek_exact(SPOT_METADATA_KEY)?.is_some())
})??;
if exists {
info!("Spot metadata already exists in database");
return Ok(());
}
// Fetch from API
let metadata = match erc20_contract_to_spot_token(chain_id) {
Ok(m) => m,
Err(e) => {
info!("Failed to fetch spot metadata from API: {}. Will be fetched on-demand.", e);
return Ok(());
}
};
// Store to database
reth_compat::store_spot_metadata(&db, &metadata)?;
info!("Successfully fetched and stored spot metadata for chain {}", chain_id);
Ok(())
}
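A sketch of how these helpers are expected to be wired together at node startup (the actual call site lives elsewhere in the tree; `db` and the module paths below are placeholders):
use std::sync::Arc;

fn wire_spot_metadata(db: Arc<reth_db::DatabaseEnv>, chain_id: u64) {
    // Allow on-demand API fetches to be persisted back to the database.
    crate::node::types::set_spot_metadata_db(db.clone());
    // Seed the in-memory cache from whatever is already stored for this chain.
    load_spot_metadata_cache(&db, chain_id);
}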

View File

@@ -5,6 +5,7 @@ use std::collections::BTreeMap;
 use crate::chainspec::{MAINNET_CHAIN_ID, TESTNET_CHAIN_ID};
+pub mod init;
 mod patch;
 #[derive(Debug, Clone, Serialize, Deserialize)]
@@ -25,7 +26,7 @@ pub struct SpotMeta {
 }
 #[derive(Debug, Clone)]
-pub(crate) struct SpotId {
+pub struct SpotId {
     pub index: u64,
 }

View File

@@ -1,9 +1,6 @@
 use crate::{
-    HlBlock, HlBlockBody, HlPrimitives,
-    node::{
-        primitives::tx_wrapper::{convert_to_eth_block_body, convert_to_hl_block_body},
-        types::HlExtras,
-    },
+    HlBlock, HlBlockBody, HlHeader, HlPrimitives,
+    node::{primitives::TransactionSigned, types::HlExtras},
 };
 use alloy_consensus::BlockHeader;
 use alloy_primitives::Bytes;
@@ -13,6 +10,7 @@ use reth_db::{
     cursor::{DbCursorRO, DbCursorRW},
     transaction::{DbTx, DbTxMut},
 };
+use reth_primitives_traits::Block;
 use reth_provider::{
     BlockBodyReader, BlockBodyWriter, ChainSpecProvider, ChainStorageReader, ChainStorageWriter,
     DBProvider, DatabaseProvider, EthStorage, ProviderResult, ReadBodyInput, StorageLocation,
@@ -23,7 +21,7 @@ pub mod tables;
 #[derive(Debug, Clone, Default)]
 #[non_exhaustive]
-pub struct HlStorage(EthStorage);
+pub struct HlStorage(EthStorage<TransactionSigned, HlHeader>);
 impl HlStorage {
     fn write_precompile_calls<Provider>(
@@ -89,30 +87,17 @@ where
         let mut read_precompile_calls = Vec::with_capacity(bodies.len());
         for (block_number, body) in bodies {
-            match body {
+            let (inner_opt, extras) = match body {
                 Some(HlBlockBody {
                     inner,
                     sidecars: _,
-                    read_precompile_calls: rpc,
+                    read_precompile_calls,
                     highest_precompile_address,
-                }) => {
-                    eth_bodies.push((block_number, Some(convert_to_eth_block_body(inner))));
-                    read_precompile_calls.push((
-                        block_number,
-                        HlExtras { read_precompile_calls: rpc, highest_precompile_address },
-                    ));
-                }
-                None => {
-                    eth_bodies.push((block_number, None));
-                    read_precompile_calls.push((
-                        block_number,
-                        HlExtras {
-                            read_precompile_calls: Default::default(),
-                            highest_precompile_address: None,
-                        },
-                    ));
-                }
-            }
+                }) => (Some(inner), HlExtras { read_precompile_calls, highest_precompile_address }),
+                None => Default::default(),
+            };
+            eth_bodies.push((block_number, inner_opt));
+            read_precompile_calls.push((block_number, extras));
         }
         self.0.write_block_bodies(provider, eth_bodies, write_to)?;
@@ -146,22 +131,16 @@ where
         inputs: Vec<ReadBodyInput<'_, Self::Block>>,
     ) -> ProviderResult<Vec<HlBlockBody>> {
         let read_precompile_calls = self.read_precompile_calls(provider, &inputs)?;
-        let eth_bodies = self.0.read_block_bodies(
-            provider,
-            inputs
-                .into_iter()
-                .map(|(header, transactions)| {
-                    (header, transactions.into_iter().map(|tx| tx.into_inner()).collect())
-                })
-                .collect(),
-        )?;
+        let inputs: Vec<(&<Self::Block as Block>::Header, _)> = inputs;
+        let eth_bodies = self.0.read_block_bodies(provider, inputs)?;
+        let eth_bodies: Vec<alloy_consensus::BlockBody<_, HlHeader>> = eth_bodies;
         // NOTE: sidecars are not used in HyperEVM yet.
         Ok(eth_bodies
             .into_iter()
             .zip(read_precompile_calls)
             .map(|(inner, extra)| HlBlockBody {
-                inner: convert_to_hl_block_body(inner),
+                inner,
                 sidecars: None,
                 read_precompile_calls: extra.read_precompile_calls,
                 highest_precompile_address: extra.highest_precompile_address,

View File

@@ -2,10 +2,21 @@ use alloy_primitives::{BlockNumber, Bytes};
 use reth_db::{TableSet, TableType, TableViewer, table::TableInfo, tables};
 use std::fmt;
+/// Static key used for spot metadata, as the database is unique to each chain.
+/// This may later serve as a versioning key to assist with future database migrations.
+pub const SPOT_METADATA_KEY: u64 = 0;
 tables! {
     /// Read precompile calls for each block.
     table BlockReadPrecompileCalls {
         type Key = BlockNumber;
         type Value = Bytes;
     }
+    /// Spot metadata mapping (EVM address to spot token index).
+    /// Uses a constant key since the database is chain-specific.
+    table SpotMetadata {
+        type Key = u64;
+        type Value = Bytes;
+    }
 }
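Reading the cached mapping back is a single-row lookup keyed by SPOT_METADATA_KEY; a hedged sketch mirroring load_spot_metadata_cache() in spot_meta/init.rs (names assume the table definitions above are in scope):
use reth_db::{DatabaseEnv, cursor::DbCursorRO};
use reth_db_api::{Database, transaction::DbTx};

fn read_spot_metadata_raw(db: &DatabaseEnv) -> eyre::Result<Option<Vec<u8>>> {
    // The outer `?` handles the transaction error, the inner one the cursor error.
    let row = db.view(|tx| -> Result<Option<Vec<u8>>, reth_db::DatabaseError> {
        let mut cursor = tx.cursor_read::<SpotMetadata>()?;
        Ok(cursor.seek_exact(SPOT_METADATA_KEY)?.map(|(_, bytes)| bytes.to_vec()))
    })??;
    Ok(row)
}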

View File

@@ -2,26 +2,39 @@
 //!
 //! Changes:
 //! - ReadPrecompileCalls supports RLP encoding / decoding
+use alloy_consensus::TxType;
 use alloy_primitives::{Address, B256, Bytes, Log};
 use alloy_rlp::{Decodable, Encodable, RlpDecodable, RlpEncodable};
 use bytes::BufMut;
+use reth_ethereum_primitives::EthereumReceipt;
+use reth_primitives_traits::InMemorySize;
 use serde::{Deserialize, Serialize};
 use crate::HlBlock;
 pub type ReadPrecompileCall = (Address, Vec<(ReadPrecompileInput, ReadPrecompileResult)>);
-#[derive(Debug, Clone, Serialize, Deserialize, Eq, PartialEq, Default)]
+#[derive(Debug, Clone, Serialize, Deserialize, Eq, PartialEq, Default, Hash)]
 pub struct ReadPrecompileCalls(pub Vec<ReadPrecompileCall>);
 pub(crate) mod reth_compat;
+// Re-export spot metadata functions
+pub use reth_compat::{initialize_spot_metadata_cache, set_spot_metadata_db};
 #[derive(Debug, Clone, Serialize, Deserialize, Default)]
 pub struct HlExtras {
     pub read_precompile_calls: Option<ReadPrecompileCalls>,
     pub highest_precompile_address: Option<Address>,
 }
+impl InMemorySize for HlExtras {
+    fn size(&self) -> usize {
+        self.read_precompile_calls.as_ref().map_or(0, |s| s.0.len()) +
+            self.highest_precompile_address.as_ref().map_or(0, |_| 20)
+    }
+}
 impl Encodable for ReadPrecompileCalls {
     fn encode(&self, out: &mut dyn BufMut) {
         let buf: Bytes = rmp_serde::to_vec(&self.0).unwrap().into();
@@ -56,6 +69,7 @@ impl BlockAndReceipts {
             self.read_precompile_calls.clone(),
             self.highest_precompile_address,
             self.system_txs.clone(),
+            self.receipts.clone(),
             chain_id,
         )
     }
@@ -84,6 +98,23 @@ pub struct LegacyReceipt {
     logs: Vec<Log>,
 }
+impl From<LegacyReceipt> for EthereumReceipt {
+    fn from(r: LegacyReceipt) -> Self {
+        EthereumReceipt {
+            tx_type: match r.tx_type {
+                LegacyTxType::Legacy => TxType::Legacy,
+                LegacyTxType::Eip2930 => TxType::Eip2930,
+                LegacyTxType::Eip1559 => TxType::Eip1559,
+                LegacyTxType::Eip4844 => TxType::Eip4844,
+                LegacyTxType::Eip7702 => TxType::Eip7702,
+            },
+            success: r.success,
+            cumulative_gas_used: r.cumulative_gas_used,
+            logs: r.logs,
+        }
+    }
+}
 #[derive(Debug, Clone, Serialize, Deserialize, Eq, PartialEq)]
 enum LegacyTxType {
     Legacy = 0,
@@ -99,6 +130,19 @@ pub struct SystemTx {
     pub receipt: Option<LegacyReceipt>,
 }
+impl SystemTx {
+    pub fn gas_limit(&self) -> u64 {
+        use reth_compat::Transaction;
+        match &self.tx {
+            Transaction::Legacy(tx) => tx.gas_limit,
+            Transaction::Eip2930(tx) => tx.gas_limit,
+            Transaction::Eip1559(tx) => tx.gas_limit,
+            Transaction::Eip4844(tx) => tx.gas_limit,
+            Transaction::Eip7702(tx) => tx.gas_limit,
+        }
+    }
+}
 #[derive(
     Debug,
     Clone,
@@ -117,7 +161,7 @@ pub struct ReadPrecompileInput {
     pub gas_limit: u64,
 }
-#[derive(Debug, Clone, Serialize, Deserialize, Eq, PartialEq)]
+#[derive(Debug, Clone, Serialize, Deserialize, Eq, PartialEq, Hash)]
 pub enum ReadPrecompileResult {
     Ok { gas_used: u64, bytes: Bytes },
     OutOfGas,

View File

@@ -1,20 +1,23 @@
 //! Copy of reth codebase to preserve serialization compatibility
+use crate::node::storage::tables::{SPOT_METADATA_KEY, SpotMetadata};
 use alloy_consensus::{Header, Signed, TxEip1559, TxEip2930, TxEip4844, TxEip7702, TxLegacy};
-use alloy_primitives::{Address, BlockHash, Signature, TxKind, U256};
+use alloy_primitives::{Address, BlockHash, Bytes, Signature, TxKind, U256};
+use reth_db::{DatabaseEnv, DatabaseError, cursor::DbCursorRW};
+use reth_db_api::{Database, transaction::DbTxMut};
 use reth_primitives::TransactionSigned as RethTxSigned;
 use serde::{Deserialize, Serialize};
 use std::{
     collections::BTreeMap,
-    sync::{Arc, LazyLock, RwLock},
+    sync::{Arc, LazyLock, Mutex, RwLock},
 };
 use tracing::info;
 use crate::{
-    HlBlock, HlBlockBody,
+    HlBlock, HlBlockBody, HlHeader,
     node::{
         primitives::TransactionSigned as TxSigned,
         spot_meta::{SpotId, erc20_contract_to_spot_token},
-        types::{ReadPrecompileCalls, SystemTx},
+        types::{LegacyReceipt, ReadPrecompileCalls, SystemTx},
     },
 };
@@ -81,10 +84,56 @@ pub struct SealedBlock {
     pub body: BlockBody,
 }
-fn system_tx_to_reth_transaction(transaction: &SystemTx, chain_id: u64) -> TxSigned {
-    static EVM_MAP: LazyLock<Arc<RwLock<BTreeMap<Address, SpotId>>>> =
-        LazyLock::new(|| Arc::new(RwLock::new(BTreeMap::new())));
+static SPOT_EVM_MAP: LazyLock<Arc<RwLock<BTreeMap<Address, SpotId>>>> =
+    LazyLock::new(|| Arc::new(RwLock::new(BTreeMap::new())));
+// Optional database handle for persisting on-demand fetches
+static DB_HANDLE: LazyLock<Mutex<Option<Arc<DatabaseEnv>>>> = LazyLock::new(|| Mutex::new(None));
+/// Set the database handle for persisting spot metadata
+pub fn set_spot_metadata_db(db: Arc<DatabaseEnv>) {
+    *DB_HANDLE.lock().unwrap() = Some(db);
+}
+/// Initialize the spot metadata cache with data loaded from database.
+/// This should be called during node initialization.
+pub fn initialize_spot_metadata_cache(metadata: BTreeMap<Address, SpotId>) {
+    *SPOT_EVM_MAP.write().unwrap() = metadata;
+}
+/// Helper function to serialize and store spot metadata to database
+pub fn store_spot_metadata(
+    db: &Arc<DatabaseEnv>,
+    metadata: &BTreeMap<Address, SpotId>,
+) -> Result<(), DatabaseError> {
+    db.update(|tx| {
+        let mut cursor = tx.cursor_write::<SpotMetadata>()?;
+        // Serialize to BTreeMap<Address, u64>
+        let serializable_map: BTreeMap<Address, u64> =
+            metadata.iter().map(|(addr, spot)| (*addr, spot.index)).collect();
+        cursor.upsert(
+            SPOT_METADATA_KEY,
+            &Bytes::from(
+                rmp_serde::to_vec(&serializable_map).expect("Failed to serialize spot metadata"),
+            ),
+        )?;
+        Ok(())
+    })?
+}
+/// Persist spot metadata to database if handle is available
+fn persist_spot_metadata_to_db(metadata: &BTreeMap<Address, SpotId>) {
+    if let Some(db) = DB_HANDLE.lock().unwrap().as_ref() {
+        match store_spot_metadata(db, metadata) {
+            Ok(_) => info!("Persisted spot metadata to database"),
+            Err(e) => info!("Failed to persist spot metadata to database: {}", e),
+        }
+    }
+}
+fn system_tx_to_reth_transaction(transaction: &SystemTx, chain_id: u64) -> TxSigned {
     let Transaction::Legacy(tx) = &transaction.tx else {
         panic!("Unexpected transaction type");
     };
@@ -95,41 +144,60 @@ fn system_tx_to_reth_transaction(transaction: &SystemTx, chain_id: u64) -> TxSig
         U256::from(0x1)
     } else {
         loop {
-            if let Some(spot) = EVM_MAP.read().unwrap().get(&to) {
+            if let Some(spot) = SPOT_EVM_MAP.read().unwrap().get(&to) {
                 break spot.to_s();
             }
-            info!("Contract not found: {to:?} from spot mapping, fetching again...");
-            *EVM_MAP.write().unwrap() = erc20_contract_to_spot_token(chain_id).unwrap();
+            // Cache miss - fetch from API, update cache, and persist to database
+            info!("Contract not found: {to:?} from spot mapping, fetching from API...");
+            let metadata = erc20_contract_to_spot_token(chain_id).unwrap();
+            *SPOT_EVM_MAP.write().unwrap() = metadata.clone();
+            persist_spot_metadata_to_db(&metadata);
         }
     };
     let signature = Signature::new(U256::from(0x1), s, true);
     TxSigned::Default(RethTxSigned::Legacy(Signed::new_unhashed(tx.clone(), signature)))
 }
 impl SealedBlock {
     pub fn to_reth_block(
         &self,
         read_precompile_calls: ReadPrecompileCalls,
         highest_precompile_address: Option<Address>,
-        system_txs: Vec<super::SystemTx>,
+        mut system_txs: Vec<super::SystemTx>,
+        receipts: Vec<LegacyReceipt>,
         chain_id: u64,
     ) -> HlBlock {
+        // NOTE: These types of transactions are tracked at #97.
+        system_txs.retain(|tx| tx.receipt.is_some());
         let mut merged_txs = vec![];
         merged_txs.extend(system_txs.iter().map(|tx| system_tx_to_reth_transaction(tx, chain_id)));
         merged_txs.extend(self.body.transactions.iter().map(|tx| tx.to_reth_transaction()));
+        let mut merged_receipts = vec![];
+        merged_receipts.extend(system_txs.iter().map(|tx| tx.receipt.clone().unwrap().into()));
+        merged_receipts.extend(receipts.into_iter().map(From::from));
         let block_body = HlBlockBody {
             inner: reth_primitives::BlockBody {
                 transactions: merged_txs,
                 withdrawals: self.body.withdrawals.clone(),
-                ommers: self.body.ommers.clone(),
+                ommers: vec![],
             },
             sidecars: None,
            read_precompile_calls: Some(read_precompile_calls),
            highest_precompile_address,
         };
-        HlBlock { header: self.header.header.clone(), body: block_body }
+        let system_tx_count = system_txs.len() as u64;
+        HlBlock {
+            header: HlHeader::from_ethereum_header(
+                self.header.header.clone(),
+                &merged_receipts,
+                system_tx_count,
+            ),
+            body: block_body,
+        }
     }
 }

View File

@@ -46,7 +46,7 @@ impl BlockSourceConfig {
                 .expect("home dir not found")
                 .join("hl")
                 .join("data")
-                .join("evm_blocks_and_receipts"),
+                .join("evm_block_and_receipts"),
             },
             block_source_from_node: None,
         }

View File

@@ -81,12 +81,13 @@ impl BlockPoller {
             .await
             .ok_or(eyre::eyre!("Failed to find latest block number"))?;
-        if let Some(debug_cutoff_height) = debug_cutoff_height
-            && next_block_number > debug_cutoff_height {
-            next_block_number = debug_cutoff_height;
-        }
-        loop {
+        loop {
+            if let Some(debug_cutoff_height) = debug_cutoff_height &&
+                next_block_number > debug_cutoff_height
+            {
+                next_block_number = debug_cutoff_height;
+            }
             match block_source.collect_block(next_block_number).await {
                 Ok(block) => {
                     block_tx.send((next_block_number, block)).await?;

View File

@@ -27,7 +27,7 @@ impl LocalBlocksCache {
     }
     pub fn get_block(&mut self, height: u64) -> Option<BlockAndReceipts> {
-        self.cache.remove(&height)
+        self.cache.get(&height).cloned()
     }
     pub fn get_path_for_height(&self, height: u64) -> Option<PathBuf> {

View File

@ -8,7 +8,7 @@ mod time_utils;
use self::{ use self::{
cache::LocalBlocksCache, cache::LocalBlocksCache,
file_ops::FileOperations, file_ops::FileOperations,
scan::{ScanOptions, Scanner}, scan::{LineStream, ScanOptions, Scanner},
time_utils::TimeUtils, time_utils::TimeUtils,
}; };
use super::{BlockSource, BlockSourceBoxed}; use super::{BlockSource, BlockSourceBoxed};
@ -52,6 +52,8 @@ pub struct HlNodeBlockSourceMetrics {
pub fetched_from_hl_node: Counter, pub fetched_from_hl_node: Counter,
/// How many times the HL node block source is fetched from the fallback /// How many times the HL node block source is fetched from the fallback
pub fetched_from_fallback: Counter, pub fetched_from_fallback: Counter,
/// How many times `try_collect_local_block` was faster than ingest loop
pub file_read_triggered: Counter,
} }
impl BlockSource for HlNodeBlockSource { impl BlockSource for HlNodeBlockSource {
@ -64,7 +66,9 @@ impl BlockSource for HlNodeBlockSource {
Box::pin(async move { Box::pin(async move {
let now = OffsetDateTime::now_utc(); let now = OffsetDateTime::now_utc();
if let Some(block) = Self::try_collect_local_block(local_blocks_cache, height).await { if let Some(block) =
Self::try_collect_local_block(&metrics, local_blocks_cache, height).await
{
Self::update_last_fetch(last_local_fetch, height, now).await; Self::update_last_fetch(last_local_fetch, height, now).await;
metrics.fetched_from_hl_node.increment(1); metrics.fetched_from_hl_node.increment(1);
return Ok(block); return Ok(block);
@ -120,6 +124,28 @@ impl BlockSource for HlNodeBlockSource {
} }
} }
struct CurrentFile {
path: PathBuf,
line_stream: Option<LineStream>,
}
impl CurrentFile {
pub fn from_datetime(dt: OffsetDateTime, root: &Path) -> Self {
let (hour, day_str) = (dt.hour(), TimeUtils::date_from_datetime(dt));
let path = root.join(HOURLY_SUBDIR).join(&day_str).join(format!("{hour}"));
Self { path, line_stream: None }
}
pub fn open(&mut self) -> eyre::Result<()> {
if self.line_stream.is_some() {
return Ok(());
}
self.line_stream = Some(LineStream::from_path(&self.path)?);
Ok(())
}
}
impl HlNodeBlockSource { impl HlNodeBlockSource {
async fn update_last_fetch( async fn update_last_fetch(
last_local_fetch: Arc<Mutex<Option<(u64, OffsetDateTime)>>>, last_local_fetch: Arc<Mutex<Option<(u64, OffsetDateTime)>>>,
@ -133,6 +159,7 @@ impl HlNodeBlockSource {
} }
async fn try_collect_local_block( async fn try_collect_local_block(
metrics: &HlNodeBlockSourceMetrics,
local_blocks_cache: Arc<Mutex<LocalBlocksCache>>, local_blocks_cache: Arc<Mutex<LocalBlocksCache>>,
height: u64, height: u64,
) -> Option<BlockAndReceipts> { ) -> Option<BlockAndReceipts> {
@ -142,9 +169,10 @@ impl HlNodeBlockSource {
} }
let path = u_cache.get_path_for_height(height)?; let path = u_cache.get_path_for_height(height)?;
info!("Loading block data from {:?}", path); info!("Loading block data from {:?}", path);
metrics.file_read_triggered.increment(1);
let mut line_stream = LineStream::from_path(&path).ok()?;
let scan_result = Scanner::scan_hour_file( let scan_result = Scanner::scan_hour_file(
&path, &mut line_stream,
&mut 0,
ScanOptions { start_height: 0, only_load_ranges: false }, ScanOptions { start_height: 0, only_load_ranges: false },
); );
u_cache.load_scan_result(scan_result); u_cache.load_scan_result(scan_result);
@ -165,9 +193,10 @@ impl HlNodeBlockSource {
} else { } else {
warn!("Failed to parse last line of file: {:?}", subfile); warn!("Failed to parse last line of file: {:?}", subfile);
} }
let mut line_stream =
LineStream::from_path(&subfile).expect("Failed to open line stream");
let mut scan_result = Scanner::scan_hour_file( let mut scan_result = Scanner::scan_hour_file(
&subfile, &mut line_stream,
&mut 0,
ScanOptions { start_height: cutoff_height, only_load_ranges: true }, ScanOptions { start_height: cutoff_height, only_load_ranges: true },
); );
scan_result.new_blocks.clear(); // Only store ranges, load data lazily scan_result.new_blocks.clear(); // Only store ranges, load data lazily
@ -188,15 +217,13 @@ impl HlNodeBlockSource {
} }
tokio::time::sleep(TAIL_INTERVAL).await; tokio::time::sleep(TAIL_INTERVAL).await;
}; };
let (mut hour, mut day_str, mut last_line) = let mut current_file = CurrentFile::from_datetime(dt, &root);
(dt.hour(), TimeUtils::date_from_datetime(dt), 0);
info!("Starting local ingest loop from height: {}", current_head); info!("Starting local ingest loop from height: {}", current_head);
loop { loop {
let hour_file = root.join(HOURLY_SUBDIR).join(&day_str).join(format!("{hour}")); let _ = current_file.open();
if hour_file.exists() { if let Some(line_stream) = &mut current_file.line_stream {
let scan_result = Scanner::scan_hour_file( let scan_result = Scanner::scan_hour_file(
&hour_file, line_stream,
&mut last_line,
ScanOptions { start_height: next_height, only_load_ranges: false }, ScanOptions { start_height: next_height, only_load_ranges: false },
); );
next_height = scan_result.next_expected_height; next_height = scan_result.next_expected_height;
@ -205,11 +232,8 @@ impl HlNodeBlockSource {
let now = OffsetDateTime::now_utc(); let now = OffsetDateTime::now_utc();
if dt + ONE_HOUR < now { if dt + ONE_HOUR < now {
dt += ONE_HOUR; dt += ONE_HOUR;
(hour, day_str, last_line) = (dt.hour(), TimeUtils::date_from_datetime(dt), 0); current_file = CurrentFile::from_datetime(dt, &root);
info!( info!("Moving to new file: {:?}", current_file.path);
"Moving to new file: {:?}",
root.join(HOURLY_SUBDIR).join(&day_str).join(format!("{hour}"))
);
continue; continue;
} }
tokio::time::sleep(TAIL_INTERVAL).await; tokio::time::sleep(TAIL_INTERVAL).await;

View File

@ -2,7 +2,7 @@ use crate::node::types::{BlockAndReceipts, EvmBlock};
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use std::{ use std::{
fs::File, fs::File,
io::{BufRead, BufReader}, io::{BufRead, BufReader, Seek, SeekFrom},
ops::RangeInclusive, ops::RangeInclusive,
path::{Path, PathBuf}, path::{Path, PathBuf},
}; };
@ -25,6 +25,57 @@ pub struct ScanOptions {
pub struct Scanner; pub struct Scanner;
/// Stream for sequentially reading lines from a file.
///
/// This struct allows sequential iteration over lines via the [Self::next] method.
/// It is resilient to cases where the line producer process is interrupted while writing:
/// - If a line is incomplete but still ends with a line ending, it is skipped: later, the fallback
/// block source will be used to retrieve the missing block.
/// - If a line does not end with a newline (i.e., the write was incomplete), the method returns
/// `None` to break out of the loop and avoid reading partial data.
/// - If a temporary I/O error occurs, the stream gives up without rewinding the cursor, so any
///   bytes already consumed by the failed read are skipped on the next call.
pub struct LineStream {
path: PathBuf,
reader: BufReader<File>,
}
impl LineStream {
pub fn from_path(path: &Path) -> std::io::Result<Self> {
let reader = BufReader::with_capacity(1024 * 1024, File::open(path)?);
Ok(Self { path: path.to_path_buf(), reader })
}
pub fn next(&mut self) -> Option<String> {
let mut line_buffer = vec![];
let Ok(size) = self.reader.read_until(b'\n', &mut line_buffer) else {
// Temporary I/O error; end this scan and pick up from the current cursor on the next pass
return None;
};
// Now cursor is right after the end of the line
// On UTF-8 error, skip the line
let Ok(mut line) = String::from_utf8(line_buffer) else {
return Some(String::new());
};
// If line is not completed yet, return None so that we can break the loop
if line.ends_with('\n') {
// Strip the trailing '\n' first, then any '\r' left over from a CRLF ending
line.pop();
if line.ends_with('\r') {
line.pop();
}
return Some(line);
}
// info!("Line is not completed yet: {}", line);
if size != 0 {
self.reader.seek(SeekFrom::Current(-(size as i64))).unwrap();
}
None
}
}
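The doc comment above describes how `next` tolerates a producer interrupted mid-write. A self-contained sketch of the same seek-back idea using only the standard library; the function name and sample data are made up for illustration, and this is not the crate's code:

use std::io::{BufRead, BufReader, Cursor, Seek, SeekFrom};

fn next_complete_line<R: BufRead + Seek>(reader: &mut R) -> Option<String> {
    let mut buf = Vec::new();
    let size = reader.read_until(b'\n', &mut buf).ok()?;
    let mut line = String::from_utf8(buf).ok()?;
    if line.ends_with('\n') {
        line.pop();
        if line.ends_with('\r') {
            line.pop();
        }
        return Some(line);
    }
    if size != 0 {
        // Trailing bytes belong to a line still being written; rewind so the
        // next call re-reads them once the newline has arrived.
        reader.seek(SeekFrom::Current(-(size as i64))).ok()?;
    }
    None
}

fn main() {
    let data = b"block one\nblock two (still being writ".to_vec();
    let mut reader = BufReader::new(Cursor::new(data));
    assert_eq!(next_complete_line(&mut reader).as_deref(), Some("block one"));
    // The unterminated second line is held back and retried on a later call.
    assert_eq!(next_complete_line(&mut reader), None);
}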
impl Scanner { impl Scanner {
pub fn line_to_evm_block(line: &str) -> serde_json::Result<(BlockAndReceipts, u64)> { pub fn line_to_evm_block(line: &str) -> serde_json::Result<(BlockAndReceipts, u64)> {
let LocalBlockAndReceipts(_, parsed_block): LocalBlockAndReceipts = let LocalBlockAndReceipts(_, parsed_block): LocalBlockAndReceipts =
@ -35,31 +86,20 @@ impl Scanner {
Ok((parsed_block, height)) Ok((parsed_block, height))
} }
pub fn scan_hour_file(path: &Path, last_line: &mut usize, options: ScanOptions) -> ScanResult { pub fn scan_hour_file(line_stream: &mut LineStream, options: ScanOptions) -> ScanResult {
let lines: Vec<String> =
BufReader::new(File::open(path).expect("Failed to open hour file"))
.lines()
.collect::<Result<_, _>>()
.unwrap();
let skip = if *last_line == 0 { 0 } else { *last_line - 1 };
let mut new_blocks = Vec::new(); let mut new_blocks = Vec::new();
let mut last_height = options.start_height; let mut last_height = options.start_height;
let mut block_ranges = Vec::new(); let mut block_ranges = Vec::new();
let mut current_range: Option<(u64, u64)> = None; let mut current_range: Option<(u64, u64)> = None;
for (line_idx, line) in lines.iter().enumerate().skip(skip) { while let Some(line) = line_stream.next() {
if line_idx < *last_line || line.trim().is_empty() { match Self::line_to_evm_block(&line) {
continue;
}
match Self::line_to_evm_block(line) {
Ok((parsed_block, height)) => { Ok((parsed_block, height)) => {
if height >= options.start_height { if height >= options.start_height {
last_height = last_height.max(height); last_height = last_height.max(height);
if !options.only_load_ranges { if !options.only_load_ranges {
new_blocks.push(parsed_block); new_blocks.push(parsed_block);
} }
*last_line = line_idx;
} }
match current_range { match current_range {
@ -74,16 +114,17 @@ impl Scanner {
} }
} }
} }
Err(_) => warn!("Failed to parse line: {}...", line.get(0..50).unwrap_or(line)), Err(_) => warn!("Failed to parse line: {}...", line.get(0..50).unwrap_or(&line)),
} }
} }
if let Some((start, end)) = current_range { if let Some((start, end)) = current_range {
block_ranges.push(start..=end); block_ranges.push(start..=end);
} }
ScanResult { ScanResult {
path: path.to_path_buf(), path: line_stream.path.clone(),
next_expected_height: last_height + 1, next_expected_height: last_height + current_range.is_some() as u64,
new_blocks, new_blocks,
new_block_ranges: block_ranges, new_block_ranges: block_ranges,
} }
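With the `current_range.is_some() as u64` term, a scan pass that finds no blocks now leaves the expected height at `start_height` rather than bumping past it. A small worked example of the two cases, with hypothetical heights:

fn next_expected(last_height: u64, found_any_block: bool) -> u64 {
    // Mirrors `last_height + current_range.is_some() as u64` from scan_hour_file
    last_height + found_any_block as u64
}

fn main() {
    // A pass that parsed blocks 100..=102 (start_height 100): ask for 103 next.
    assert_eq!(next_expected(102, true), 103);
    // A pass that found nothing: stay at 100; the old `last_height + 1`
    // would have skipped ahead to 101.
    assert_eq!(next_expected(100, false), 100);
}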

49
tests/run_tests.sh Normal file
View File

@ -0,0 +1,49 @@
#!/bin/bash
set -e
export ETH_RPC_URL="${ETH_RPC_URL:-wss://hl-archive-node.xyz}"
success() {
echo "Success: $1"
}
fail() {
echo "Failed: $1"
exit 1
}
ensure_cmd() {
command -v "$1" > /dev/null 2>&1 || fail "$1 is required"
}
ensure_cmd jq
ensure_cmd cast
ensure_cmd wscat
if [[ ! "$ETH_RPC_URL" =~ ^wss?:// ]]; then
fail "ETH_RPC_URL must be a websocket url"
fi
TITLE="Issue #78 - eth_getLogs should return system transactions"
cast logs \
--rpc-url "$ETH_RPC_URL" \
--from-block 15312567 \
--to-block 15312570 \
--address 0x9fdbda0a5e284c32744d2f17ee5c74b284993463 \
0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef \
| grep -q "0x00000000000000000000000020000000000000000000000000000000000000c5" \
&& success "$TITLE" || fail "$TITLE"
TITLE="Issue #78 - eth_getBlockByNumber should return the same logsBloom as official RPC"
OFFICIAL_RPC="https://rpc.hyperliquid.xyz/evm"
A=$(cast block 1394092 --rpc-url "$ETH_RPC_URL" -f logsBloom | md5sum)
B=$(cast block 1394092 --rpc-url "$OFFICIAL_RPC" -f logsBloom | md5sum)
echo node "$A"
echo rpc\ "$B"
[[ "$A" == "$B" ]] && success "$TITLE" || fail "$TITLE"
TITLE="eth_subscribe newHeads via wscat"
CMD='{"jsonrpc":"2.0","id":1,"method":"eth_subscribe","params":["newHeads"]}'
wscat -w 2 -c "$ETH_RPC_URL" -x "$CMD" | tail -1 | jq -r .params.result.nonce | grep 0x \
&& success "$TITLE" || fail "$TITLE"