Building a blockchain proof-of-concept used to mean days of manual configuration. You'd wrestle with genesis files, fumble through P2P discovery, debug EVM opcode mismatches, and still end up with a fragile testnet nobody wanted to maintain. What if you could go from zero to a working supply chain PoC on Besu in under 10 minutes?
That's what happens when you combine ChainLaunch's automated network provisioning with Claude Code's ability to write, compile, and deploy smart contracts from plain English prompts. The entire workflow — testnet creation, contract development, deployment, and lifecycle demo — collapses into a single conversation.
Most enterprise blockchain PoCs stall at the infrastructure phase. Teams spend so long debugging genesis configs and networking that stakeholders never see a working system. The tooling gap between "I want to test this idea" and "I have a running demo" kills momentum before it starts.
TL;DR: Claude Code + ChainLaunch spins up a 4-node Besu QBFT testnet, verifies it with ethers.js (contract deploy, state read/write, ETH transfers), then deploys a full supply chain contract with lifecycle operations — all from natural language prompts. You get a production-shaped PoC in minutes, not days.
Wondering whether Besu or Fabric fits your use case better? Read our Hyperledger Fabric vs Besu comparison.
What You'll Build
This tutorial produces a fully working, demo-ready supply chain PoC. Here's exactly what you'll have by the end.
Network layer:
- 4-node Besu QBFT testnet with automatic validator key generation
- Pre-funded accounts (1,000 ETH) for contract deployment and transactions
- RPC endpoints exposed per node for direct contract interaction
- Prometheus metrics and P2P discovery configured automatically
Network verification (Node.js):
- Connect to the QBFT network and inspect validators and peers
- Generate new wallets with private keys
- Deploy and interact with a smart contract using ethers.js
- Transfer ETH between accounts (send and receive)
Smart contract layer:
- `SupplyChain.sol` — a Solidity contract tracking items through a supply chain
- Functions: `createItem`, `updateStatus`, `transferOwnership`, `getItem`, `getHistoryEntry`
- On-chain provenance history per item, with actor, location, and timestamp
Demo scenario:
- Create a batch of "Organic Coffee Beans" at origin farm
- Update status from `CREATED` to `IN_TRANSIT` with location tracking
- Transfer ownership from farm to warehouse, then warehouse to retailer
- Query the full provenance history to verify chain of custody
For a deeper look at QBFT consensus, see our QBFT consensus guide.
What Do You Need Before Starting?
You'll need five things installed, all standard developer tools.
- ChainLaunch — install via the quickstart guide or run `curl -fsSL https://chainlaunch.dev/deploy.sh | bash`
- Claude Code CLI — `npm install -g @anthropic-ai/claude-code` (requires Node 18+)
- Node.js 18+ — for network verification and ethers.js interaction (`brew install node`)
- Solidity compiler — `brew install solidity` or `npm install -g solc`
- Go 1.21+ — for the deployment and interaction tool (`brew install go`)
That's it. No Docker Compose files to write. No Kubernetes manifests. No genesis block JSON to hand-craft. ChainLaunch handles all of that.
Step 1: How Do You Create the Besu Testnet?
A single command creates your entire 4-node QBFT testnet. ChainLaunch generates validator keys, builds the genesis block, configures P2P discovery, and starts all nodes as managed services. Manually configuring a QBFT network requires at minimum 8 separate configuration files per the Hyperledger Besu documentation — ChainLaunch reduces that to one CLI call.
First, find your key provider ID. On a fresh install the default database provider is typically ID 1, but verify with:
curl -s $CHAINLAUNCH_API_URL/key-providers -u $CHAINLAUNCH_USER:$CHAINLAUNCH_PASSWORD | jq '.[0].id'

Then create the testnet:
chainlaunch testnet besu \
--name my-poc \
--nodes 4 \
--mode docker \
--provider-id 1 \
--initial-balance "0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266=0x3635C9ADC5DEA00000"

Replace `--provider-id 1` with whatever ID the command above returned.
Why 4 nodes? QBFT requires a minimum of 4 validators to tolerate a single Byzantine fault (the formula is n = 3f + 1). Drop below 4 and you lose fault tolerance entirely. For a PoC, 4 is the sweet spot — realistic enough to demo enterprise resilience, lean enough to run on a laptop.
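The n = 3f + 1 relationship is easy to sanity-check with a few lines of Node. The helper below is a throwaway illustration, not part of the tutorial code:

```javascript
// QBFT tolerates f Byzantine validators out of n, where n >= 3f + 1.
// Equivalently, for a given validator count n, f = floor((n - 1) / 3).
function maxFaultyValidators(n) {
  return Math.floor((n - 1) / 3);
}

console.log(maxFaultyValidators(3)); // 0 -- no fault tolerance at all
console.log(maxFaultyValidators(4)); // 1 -- the PoC sweet spot
console.log(maxFaultyValidators(7)); // 2 -- tolerates two faulty validators
```

Notice that going from 4 to 5 or 6 validators buys you nothing: fault tolerance only increases at 7.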
The --initial-balance flag pre-funds an account in the genesis block. That hex value (0x3635C9ADC5DEA00000) equals 1,000 ETH. It's the account you'll use to deploy and call the contract — no faucets, no waiting.
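You can verify that hex value with plain BigInt arithmetic; no library required. The `hexWeiToEth` helper below is illustrative only:

```javascript
// Genesis "alloc" balances are hex-encoded wei; 1 ETH = 10^18 wei.
const WEI_PER_ETH = 10n ** 18n;

function hexWeiToEth(hexWei) {
  return BigInt(hexWei) / WEI_PER_ETH; // whole ETH, truncating any remainder
}

console.log(hexWeiToEth("0x3635C9ADC5DEA00000").toString()); // "1000"
```

The same conversion is what `ethers.formatEther` does under the hood when you check balances in Step 2.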
Once the command completes, you'll see output like this:
Creating 4 validator keys...
Creating key for node besu-my-poc-1...
Key created: ID 5
Creating key for node besu-my-poc-2...
Key created: ID 6
Creating key for node besu-my-poc-3...
Key created: ID 7
Creating key for node besu-my-poc-4...
Key created: ID 8
Creating Besu network 'my-poc' with 4 validators...
Besu network created: ID 1
Creating 4 Besu nodes...
Creating Besu node besu-my-poc-1 with key ID 5...
Node created ID: 1
Creating Besu node besu-my-poc-2 with key ID 6...
Node created ID: 2
Creating Besu node besu-my-poc-3 with key ID 7...
Node created ID: 3
Creating Besu node besu-my-poc-4 with key ID 8...
Node created ID: 4
Besu testnet created successfully! Network ID: 1
Important: The RPC port for each node is assigned dynamically. To find the RPC URL for node 1, query the API:
RPC_PORT=$(curl -s $CHAINLAUNCH_API_URL/nodes/1 \
-u $CHAINLAUNCH_USER:$CHAINLAUNCH_PASSWORD | jq '.besuNode.rpcPort')
echo "RPC URL: http://localhost:$RPC_PORT"

Use that URL for all subsequent commands. Verify the network is producing blocks:
curl -s -X POST http://localhost:$RPC_PORT \
-H "Content-Type: application/json" \
--data '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'

You should see a block number incrementing every 5 seconds. That's your QBFT network running.
Want to compare Besu deployment approaches? See our Besu deployment tools comparison.
Step 2: How Do You Verify and Use the Network with Node.js?
Before building the supply chain contract, let's verify the network is working and learn the fundamentals: connecting, generating wallets, deploying a contract, reading and writing state, and transferring ETH. This step uses Node.js and ethers.js — the most common toolchain for Ethereum development.
Set Up the Project
Create a test project and install ethers.js:
mkdir besu-test && cd besu-test
npm init -y
npm install ethers

Connect and Inspect the Network
Create test.mjs with the following. Replace RPC_PORT with the port you found in Step 1:
import { ethers } from "ethers";
const RPC_URL = "http://localhost:8553"; // Replace with your node's RPC port
// Pre-funded account from genesis (Hardhat default #0)
const FUNDED_KEY = "0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80";
async function main() {
const provider = new ethers.JsonRpcProvider(RPC_URL);
// Check network info
const network = await provider.getNetwork();
console.log("Chain ID:", network.chainId.toString());
const blockNumber = await provider.getBlockNumber();
console.log("Block number:", blockNumber);
// Connect the pre-funded wallet
const wallet = new ethers.Wallet(FUNDED_KEY, provider);
console.log("\nFunded wallet:");
console.log(" Address:", wallet.address);
const balance = await provider.getBalance(wallet.address);
console.log(" Balance:", ethers.formatEther(balance), "ETH");
// Generate a brand new wallet
const newWallet = ethers.Wallet.createRandom().connect(provider);
console.log("\nNew wallet:");
console.log(" Address:", newWallet.address);
console.log(" Private key:", newWallet.privateKey);
// Check QBFT validators
const validators = await provider.send(
"qbft_getValidatorsByBlockNumber", ["latest"]
);
console.log("\nQBFT validators:", validators.length);
validators.forEach((v, i) => console.log(` ${i + 1}. ${v}`));
// Check peer connectivity
const peerCount = await provider.send("net_peerCount", []);
console.log("Peer count:", parseInt(peerCount, 16));
}
main().catch(console.error);

Run it:

node test.mjs

You should see output like this:
Chain ID: 1337
Block number: 22806
Funded wallet:
Address: 0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266
Balance: 1000.0 ETH
New wallet:
Address: 0x94Ff489cba2fB5f17e9BAC432C5f590cBcf0609c
Private key: 0xa62c5f1a03bbf658b93982508338eec65b8812e6...
QBFT validators: 4
1. 0x2c8eb495d4fe51094a2b4faa6ea7c50c3b2589a5
2. 0x2d137943e692fdc77dd3799ed1ee0b7b31859343
3. 0x7bc5add7496dec259befb11223262f09872c8f69
4. 0xdbdabf6f6e4621a2290a74a77a193e60fc652659
Peer count: 3
Four validators, three peers (each node sees the other three), blocks incrementing every 5 seconds, and 1,000 ETH in the pre-funded account. The network is live.
Deploy a Smart Contract
Now deploy a simple storage contract. This compiles Solidity in-process using the solc npm package and deploys it with a signed transaction. Add solc to your project:
npm install solc

Create deploy-and-use.mjs:
import { ethers } from "ethers";
import solc from "solc";
const RPC_URL = "http://localhost:8553";
const FUNDED_KEY =
"0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80";
function compileSolidity() {
const source = `
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;
contract SimpleStorage {
uint256 private _value;
function set(uint256 x) external {
_value = x;
}
function get() external view returns (uint256) {
return _value;
}
}`;
const input = {
language: "Solidity",
sources: { "SimpleStorage.sol": { content: source } },
settings: {
evmVersion: "berlin", // Must match genesis config
outputSelection: { "*": { "*": ["abi", "evm.bytecode.object"] } },
},
};
const output = JSON.parse(solc.compile(JSON.stringify(input)));
if (output.errors?.some((e) => e.severity === "error")) {
throw new Error(output.errors.map((e) => e.message).join("\n"));
}
const contract = output.contracts["SimpleStorage.sol"]["SimpleStorage"];
return {
abi: contract.abi,
bytecode: "0x" + contract.evm.bytecode.object,
};
}
async function main() {
const provider = new ethers.JsonRpcProvider(RPC_URL);
const wallet = new ethers.Wallet(FUNDED_KEY, provider);
const blockNumber = await provider.getBlockNumber();
// 1. Compile the contract
console.log("Compiling SimpleStorage...");
const { abi, bytecode } = compileSolidity();
console.log("Bytecode:", bytecode.length / 2, "bytes");
// 2. Deploy
console.log("\nDeploying...");
const factory = new ethers.ContractFactory(abi, bytecode, wallet);
const contract = await factory.deploy();
console.log("Deploy tx:", contract.deploymentTransaction().hash);
await contract.waitForDeployment();
const addr = await contract.getAddress();
console.log("Contract deployed at:", addr);
// 3. Read initial value
const v0 = await contract.get();
console.log("\nget() =", v0.toString(), "(initial, should be 0)");
// 4. Write a value
const tx1 = await contract.set(42);
console.log("set(42) tx:", tx1.hash);
await tx1.wait();
const v1 = await contract.get();
console.log("get() =", v1.toString(), "(should be 42)");
// 5. Write another value
const tx2 = await contract.set(1337);
await tx2.wait();
const v2 = await contract.get();
console.log("get() =", v2.toString(), "(should be 1337)");
// 6. Transfer ETH to a new wallet
console.log("\n--- ETH Transfers ---");
const newWallet = ethers.Wallet.createRandom().connect(provider);
console.log("New wallet:", newWallet.address);
const sendTx = await wallet.sendTransaction({
to: newWallet.address,
value: ethers.parseEther("1.0"),
});
await sendTx.wait();
const newBal = await provider.getBalance(newWallet.address);
console.log("Sent 1 ETH. New wallet balance:", ethers.formatEther(newBal), "ETH");
// 7. Transfer ETH back
const sendBack = await newWallet.sendTransaction({
to: wallet.address,
value: ethers.parseEther("0.5"),
});
await sendBack.wait();
console.log("New wallet sent 0.5 ETH back.");
// Summary
const endBlock = await provider.getBlockNumber();
console.log("\nBlocks during test:", endBlock - blockNumber);
console.log("\nAll operations verified!");
}
main().catch(console.error);

Run it:

node deploy-and-use.mjs

Expected output:
Compiling SimpleStorage...
Bytecode: 368 bytes
Deploying...
Deploy tx: 0x4bc25f01784...
Contract deployed at: 0xa513E6E4b8f2a923D98304ec87F64353C4D5C853
get() = 0 (initial, should be 0)
set(42) tx: 0x0030808fb7c...
get() = 42 (should be 42)
get() = 1337 (should be 1337)
--- ETH Transfers ---
New wallet: 0xcB1e7c72E843e991a24C29e54C0EA42244311311
Sent 1 ETH. New wallet balance: 1.0 ETH
New wallet sent 0.5 ETH back.
Blocks during test: 5
All operations verified!
In under a minute you've compiled Solidity, deployed a contract, written and read on-chain state, and transferred ETH between wallets — all on the QBFT testnet that ChainLaunch provisioned in Step 1.
Two critical things to note:
- `evmVersion: "berlin"` is required. ChainLaunch's genesis sets `berlinBlock: 0`. If you compile for `shanghai` (the default in recent solc versions), the bytecode uses the `PUSH0` opcode, which Berlin doesn't support. The deploy transaction succeeds but the contract silently fails at runtime. This is the single most common debugging trap with private Besu networks.
- Besu requires signed transactions. Unlike Ganache or Hardhat, Besu doesn't support unsigned `eth_sendTransaction`. Always use a private key with an ethers.js `Wallet` or the equivalent in your framework. The pre-funded key `0xac0974...` is the Hardhat default account #0 — it's only funded because ChainLaunch puts it in the genesis `alloc`.
What You Can Do From Here
Your network is now confirmed working. You can:
- Connect MetaMask using RPC URL `http://localhost:8553` (substitute your node's port) and Chain ID `1337`
- Use Hardhat or Foundry by configuring a custom network pointing to your RPC endpoint
- Build a frontend with ethers.js using the same patterns shown above
- Deploy any EVM contract — just remember to target Berlin EVM
Now let's build something more interesting: a full supply chain contract.
Step 3: How Do You Write the Solidity Contract?
The SupplyChain.sol contract is the core of your PoC. It stores item state on-chain, records every status change and ownership transfer as a history entry, and emits events so off-chain systems can react in real time. Claude Code can generate a contract like this from a single prompt: "create a supply chain tracker smart contract in Solidity that records item provenance history."
Here's the full contract:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.19;
contract SupplyChain {
struct HistoryEntry {
string action;
string actor;
string location;
string notes;
uint256 timestamp;
}
struct Item {
string name;
string owner;
string status;
string location;
bool exists;
HistoryEntry[] history;
}
mapping(string => Item) private items;
event ItemCreated(string indexed itemId, string name, string owner);
event StatusUpdated(string indexed itemId, string newStatus, string location);
event OwnershipTransferred(string indexed itemId, string from, string to);
function createItem(
string memory id,
string memory name,
string memory owner,
string memory location
) public {
require(!items[id].exists, "Item already exists");
Item storage item = items[id];
item.name = name;
item.owner = owner;
item.status = "CREATED";
item.location = location;
item.exists = true;
item.history.push(HistoryEntry(
"CREATED", owner, location, "Item registered on chain", block.timestamp
));
emit ItemCreated(id, name, owner);
}
function updateStatus(
string memory id,
string memory newStatus,
string memory location,
string memory notes
) public {
require(items[id].exists, "Item does not exist");
Item storage item = items[id];
item.status = newStatus;
item.location = location;
item.history.push(HistoryEntry(
"STATUS_UPDATE", item.owner, location, notes, block.timestamp
));
emit StatusUpdated(id, newStatus, location);
}
function transferOwnership(
string memory id,
string memory newOwner,
string memory location,
string memory notes
) public {
require(items[id].exists, "Item does not exist");
Item storage item = items[id];
string memory oldOwner = item.owner;
item.owner = newOwner;
item.location = location;
item.history.push(HistoryEntry(
"TRANSFER", oldOwner, location, notes, block.timestamp
));
emit OwnershipTransferred(id, oldOwner, newOwner);
}
function getItem(string memory id) public view returns (
string memory name, string memory owner,
string memory status, string memory location,
uint256 historyCount
) {
require(items[id].exists, "Item does not exist");
Item storage item = items[id];
return (item.name, item.owner, item.status, item.location, item.history.length);
}
function getHistoryEntry(string memory id, uint256 index) public view returns (
string memory action, string memory actor,
string memory location, string memory notes,
uint256 timestamp
) {
require(items[id].exists, "Item does not exist");
require(index < items[id].history.length, "Index out of bounds");
HistoryEntry storage entry = items[id].history[index];
return (entry.action, entry.actor, entry.location, entry.notes, entry.timestamp);
}
}

This is the actual contract we tested during development of this tutorial. String-keyed IDs (instead of auto-incrementing integers) make it easier to demo — you can use meaningful identifiers like "COFFEE-LOT-2240" instead of opaque numbers.
Step 4: How Do You Compile and Deploy?
Compilation requires one critical flag that most tutorials skip. ChainLaunch's testnet genesis sets berlinBlock: 0, which means the EVM only supports opcodes up to Berlin. If you compile with a newer EVM target (like london or the default shanghai), the resulting bytecode may contain unsupported opcodes — PUSH0 (0x5f) in Shanghai, or BASEFEE in London. The deploy transaction will appear to succeed (status 0x1) but eth_getCode returns empty bytecode. This silent failure can burn hours of debugging.
The fix: compile with --evm-version berlin to match your genesis:
solc --evm-version berlin --bin --abi SupplyChain.sol -o build/

This produces two files in build/: SupplyChain.bin (bytecode) and SupplyChain.abi (ABI). You need both for deployment.
We've hit this --evm-version issue ourselves more than once. ChainLaunch uses berlinBlock: 0 in genesis because Berlin is the most widely tested compatibility target across enterprise Besu deployments. Always match your --evm-version flag to your genesis config.
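A cheap guard against this failure mode is to always check eth_getCode after deploying. Sticking with the Node.js toolchain from Step 2, here is a minimal sketch; `isContractDeployed` is a hypothetical helper, and the commented usage assumes the `provider` and `addr` variables from the Step 2 scripts:

```javascript
// eth_getCode returns "0x" when no runtime bytecode exists at an address.
// Per the failure mode described above, a deploy receipt can report
// success while the address ends up with empty code -- the classic
// wrong-EVM-target symptom on private Besu networks.
function isContractDeployed(code) {
  return typeof code === "string" && code.replace(/^0x/, "").length > 0;
}

// Hypothetical usage against a live node (assumes provider and addr):
//   const code = await provider.getCode(addr);
//   if (!isContractDeployed(code)) {
//     throw new Error("Empty bytecode at " + addr + " -- check --evm-version");
//   }

console.log(isContractDeployed("0x"));         // false
console.log(isContractDeployed("0x60806040")); // true
```

Running this check right after `waitForDeployment()` turns a multi-hour debugging session into an immediate, explicit error.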
Here's a Go tool that handles both deployment and contract interaction. Create a directory for it, initialize a Go module, and add the code:
mkdir -p deployer && cd deployer
go mod init deployer
go get github.com/ethereum/go-ethereum

Create main.go:
package main
import (
"context"
"encoding/hex"
"fmt"
"log"
"math/big"
"os"
"strconv"
"strings"
"github.com/ethereum/go-ethereum/accounts/abi"
"github.com/ethereum/go-ethereum/accounts/abi/bind"
"github.com/ethereum/go-ethereum/common"
"github.com/ethereum/go-ethereum/crypto"
"github.com/ethereum/go-ethereum/ethclient"
)
func main() {
if len(os.Args) < 2 {
fmt.Println("Usage: go run main.go <command> [args...]")
fmt.Println("Commands: deploy, create, update, transfer, get, history")
os.Exit(1)
}
rpcURL := os.Getenv("RPC_URL")
if rpcURL == "" {
log.Fatal("RPC_URL environment variable is not set")
}
client, err := ethclient.Dial(rpcURL)
if err != nil {
log.Fatalf("Failed to connect: %v", err)
}
abiBytes, err := os.ReadFile("build/SupplyChain.abi")
if err != nil {
log.Fatalf("Failed to read ABI: %v", err)
}
parsedABI, err := abi.JSON(strings.NewReader(string(abiBytes)))
if err != nil {
log.Fatalf("Failed to parse ABI: %v", err)
}
switch os.Args[1] {
case "deploy":
deploy(client, parsedABI)
case "create":
// create <contract> <id> <name> <owner> <location>
callContract(client, parsedABI, os.Args[2], "createItem", os.Args[3], os.Args[4], os.Args[5], os.Args[6])
case "update":
// update <contract> <id> <status> <location> <notes>
callContract(client, parsedABI, os.Args[2], "updateStatus", os.Args[3], os.Args[4], os.Args[5], os.Args[6])
case "transfer":
// transfer <contract> <id> <newOwner> <location> <notes>
callContract(client, parsedABI, os.Args[2], "transferOwnership", os.Args[3], os.Args[4], os.Args[5], os.Args[6])
case "get":
// get <contract> <id>
queryItem(client, parsedABI, os.Args[2], os.Args[3])
case "history":
// history <contract> <id> <index>
queryHistory(client, parsedABI, os.Args[2], os.Args[3], os.Args[4])
default:
log.Fatalf("Unknown command: %s", os.Args[1])
}
}
func getAuth(client *ethclient.Client) *bind.TransactOpts {
privateKey, err := crypto.HexToECDSA(strings.TrimPrefix(
os.Getenv("DEPLOYER_PRIVATE_KEY"), "0x",
))
if err != nil {
log.Fatalf("Failed to load private key: %v", err)
}
chainID, err := client.ChainID(context.Background())
if err != nil {
log.Fatalf("Failed to get chain ID: %v", err)
}
auth, err := bind.NewKeyedTransactorWithChainID(privateKey, chainID)
if err != nil {
log.Fatalf("Failed to create transactor: %v", err)
}
return auth
}
func deploy(client *ethclient.Client, parsedABI abi.ABI) {
auth := getAuth(client)
binBytes, err := os.ReadFile("build/SupplyChain.bin")
if err != nil {
log.Fatalf("Failed to read bytecode: %v", err)
}
bytecode, err := hex.DecodeString(strings.TrimSpace(string(binBytes)))
if err != nil {
log.Fatalf("Failed to decode bytecode: %v", err)
}
address, tx, _, err := bind.DeployContract(auth, parsedABI, bytecode, client)
if err != nil {
log.Fatalf("Deploy failed: %v", err)
}
fmt.Printf("Contract deployed at: %s\n", address.Hex())
fmt.Printf("TX hash: %s\n", tx.Hash().Hex())
}
func callContract(client *ethclient.Client, parsedABI abi.ABI, contractAddr, method string, args ...string) {
auth := getAuth(client)
addr := common.HexToAddress(contractAddr)
contract := bind.NewBoundContract(addr, parsedABI, client, client, client)
ifaces := make([]interface{}, len(args))
for i, s := range args {
ifaces[i] = s
}
tx, err := contract.Transact(auth, method, ifaces...)
if err != nil {
log.Fatalf("Transaction failed: %v", err)
}
fmt.Printf("%s TX sent: %s\n", method, tx.Hash().Hex())
receipt, err := bind.WaitMined(context.Background(), client, tx)
if err != nil {
log.Fatalf("Wait for mining failed: %v", err)
}
fmt.Printf(" Status: %d (1=success)\n", receipt.Status)
fmt.Printf(" Block: %d\n", receipt.BlockNumber.Uint64())
}
func queryItem(client *ethclient.Client, parsedABI abi.ABI, contractAddr, itemID string) {
addr := common.HexToAddress(contractAddr)
contract := bind.NewBoundContract(addr, parsedABI, client, client, client)
var result []interface{}
err := contract.Call(&bind.CallOpts{}, &result, "getItem", itemID)
if err != nil {
log.Fatalf("Query failed: %v", err)
}
fmt.Printf("Item: %s\n", itemID)
fmt.Printf(" Name: %s\n", result[0].(string))
fmt.Printf(" Owner: %s\n", result[1].(string))
fmt.Printf(" Status: %s\n", result[2].(string))
fmt.Printf(" Location: %s\n", result[3].(string))
fmt.Printf(" History: %d entries\n", result[4].(*big.Int).Int64())
}
func queryHistory(client *ethclient.Client, parsedABI abi.ABI, contractAddr, itemID, indexStr string) {
addr := common.HexToAddress(contractAddr)
contract := bind.NewBoundContract(addr, parsedABI, client, client, client)
index, _ := strconv.ParseUint(indexStr, 10, 64)
idx := new(big.Int).SetUint64(index)
var result []interface{}
err := contract.Call(&bind.CallOpts{}, &result, "getHistoryEntry", itemID, idx)
if err != nil {
log.Fatalf("Query failed: %v", err)
}
fmt.Printf("History[%d] for %s:\n", index, itemID)
fmt.Printf(" Action: %s\n", result[0].(string))
fmt.Printf(" Actor: %s\n", result[1].(string))
fmt.Printf(" Location: %s\n", result[2].(string))
fmt.Printf(" Notes: %s\n", result[3].(string))
fmt.Printf(" Timestamp: %s\n", result[4].(*big.Int).String())
}

Copy the compiled artifacts into the deployer directory (run these commands from the project root, not from inside deployer/) and deploy:
cp -r build/ deployer/build/
cd deployer
export DEPLOYER_PRIVATE_KEY="ac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80"
export RPC_URL="http://localhost:$RPC_PORT"
go run main.go deploy

You'll get the contract address back within seconds. Save it — you'll need it for the next step.

export CONTRACT_ADDRESS="<address from deploy output>"

Step 5: How Do You Run the Full Supply Chain Lifecycle?
Now you run the demo that stakeholders actually care about. The scenario: a batch of "Organic Coffee Beans" originates at a farm in Colombia, moves through a warehouse in Miami, and ends at a retailer in New York. Every step is recorded on-chain.
The deployer tool from Step 4 supports all the contract operations. Make sure RPC_URL, DEPLOYER_PRIVATE_KEY, and CONTRACT_ADDRESS are still set from the previous step, then run from the deployer/ directory:
# Create the coffee bean item
go run main.go create $CONTRACT_ADDRESS \
"COFFEE-LOT-2240" "Organic Coffee Beans" "FincaElParaiso" "Huila, Colombia"
# Update status: shipped from farm
go run main.go update $CONTRACT_ADDRESS \
"COFFEE-LOT-2240" "IN_TRANSIT" "Miami Port, FL" "Cleared customs, temp 18C maintained"
# Transfer ownership to warehouse operator
go run main.go transfer $CONTRACT_ADDRESS \
"COFFEE-LOT-2240" "ColdLinkWarehouse" "Miami Port, FL" "Bill of Lading #BL-44892"
# Query the item state
go run main.go get $CONTRACT_ADDRESS "COFFEE-LOT-2240"
# Check history entry 0
go run main.go history $CONTRACT_ADDRESS "COFFEE-LOT-2240" 0

The output shows a complete, tamper-evident audit trail. Every status change and ownership transfer is recorded with the identity of the actor who made it. That's the core value proposition for enterprise supply chain — you don't need to trust any single party's records because the blockchain holds the authoritative history.
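To read an item's whole trail programmatically rather than one index at a time, fetch `historyCount` from `getItem` and loop over `getHistoryEntry`. Here is a sketch in Node, with the network call abstracted behind an async fetcher so the walker runs standalone; `readFullHistory` and `mockFetch` are illustrative names, not part of the tutorial code:

```javascript
// Walks history entries 0..count-1 using any async fetcher.
// With the ethers.js setup from Step 2, fetchEntry would be something like:
//   (i) => contract.getHistoryEntry("COFFEE-LOT-2240", i)
// where count comes from the historyCount field returned by getItem().
async function readFullHistory(fetchEntry, count) {
  const entries = [];
  for (let i = 0; i < count; i++) {
    entries.push(await fetchEntry(i)); // sequential keeps ordering obvious
  }
  return entries;
}

// Self-contained demo with a mock fetcher standing in for the contract:
const actions = ["CREATED", "STATUS_UPDATE", "TRANSFER"];
const mockFetch = async (i) => ({ action: actions[i] });

readFullHistory(mockFetch, 3).then((history) => {
  console.log(history.map((e) => e.action).join(" -> "));
  // CREATED -> STATUS_UPDATE -> TRANSFER
});
```

The same count-then-loop pattern works in the Go tool: call `getItem` once, then iterate `getHistoryEntry` up to the returned count.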
How Does Claude Code Orchestrate All of This?
This is where the PoC speed multiplier comes from. Claude Code doesn't just write code. It calls CLI tools, reads their output, adapts when things go wrong, and chains the entire workflow together. Here's what a real session looks like:
You: Create a 4-node Besu QBFT testnet using ChainLaunch, then write
and deploy a supply chain smart contract. Pre-fund account
0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266 with 1000 ETH.
Run a full lifecycle demo with "Organic Coffee Beans".
Claude Code:
> Running: chainlaunch testnet besu --name my-poc --nodes 4
--mode docker --initial-balance "0xf39...=0x3635C9ADC5DEA00000"
> Network ready. 4 nodes running. Querying node RPC port...
> Writing SupplyChain.sol...
> Running: solc --evm-version berlin --bin --abi SupplyChain.sol -o build/
> Writing deploy.go...
> Running: go run deploy.go
> Contract deployed at 0x5FbDB2315678afecb367f032d93F642f64180aa3
> Running lifecycle demo...
> Created: "Organic Coffee Beans, Lot #2240" at Huila, Colombia
> Updated: IN_TRANSIT at Miami Port, FL
> Transferred ownership to ColdLinkWarehouse
> Queried history: 3 entries verified
The key insight: Claude Code treats ChainLaunch as a tool it can call. It reads the network output to get RPC endpoints, passes them to the compiler and deployment scripts, and adapts when something goes wrong. For example, Claude Code knows about the --evm-version berlin requirement and applies it automatically.
In our own PoC sprints, we've run this full workflow — testnet to deployed contract to lifecycle demo — in under 8 minutes on a standard MacBook Pro. Without automated tooling, the same workflow took us 2-3 hours, with most of that time spent on genesis config debugging and P2P discovery issues.
Is every PoC this clean? No. Real enterprise PoCs add complexity: private transactions, permissioning, integration with existing systems. But starting from a working baseline in 8 minutes means you spend your time on business logic, not infrastructure plumbing.
Frequently Asked Questions
Can I use a different consensus mechanism with ChainLaunch?
QBFT is the recommended consensus for enterprise Besu deployments. It provides deterministic finality and doesn't require mining, per the Hyperledger Besu docs. IBFT2 is also supported, though QBFT is preferred for new deployments. Proof of Authority (Clique) is available for simpler test scenarios where BFT guarantees aren't needed. See our QBFT consensus guide for the full comparison.
How do I add more nodes to the network after creation?
Use chainlaunch nodes create to add validators or full nodes to an existing network. For QBFT, adding a validator requires a governance vote from existing validators through an on-chain proposal. ChainLaunch automates the voting transaction, but you still need quorum. Non-voting full nodes can be added without governance.
What about moving this PoC to production?
ChainLaunch supports --mode docker for containerized deployments and integrates with Prometheus for monitoring. Production features — automated backups, RBAC, SSO, audit logging — are available in ChainLaunch Pro. The PoC you built here is already production-shaped: same contract, same network topology, just different infrastructure targets.
Why do you show both Node.js and Go?
Step 2 uses Node.js with ethers.js because it's the fastest way to verify your network and prototype contract interactions — most developers already have Node installed and ethers.js is the dominant Ethereum library. Steps 4-5 use Go because it's a natural fit for production backend integrations (common in enterprise Besu deployments) and shows how the same contract works across toolchains. The evmVersion: "berlin" requirement applies to both — it's a Solidity compiler setting, not a runtime concern. Use whatever your team already knows.
Does ChainLaunch support private transactions on Besu?
Private transactions via Tessera (Besu's privacy manager) are on the ChainLaunch roadmap. For PoC scenarios requiring confidential transfers between known parties, application-layer encryption of sensitive fields before writing to a public contract is a practical workaround most enterprise teams use today.
What Should You Do Next?
A Besu supply chain PoC that used to take days now takes minutes. That's the practical result of combining automated network provisioning with AI-assisted contract development.
Here's what you built: real BFT consensus, on-chain provenance, pre-funded accounts, and a full lifecycle demo. Claude Code handled the boilerplate so you could focus on the business logic.
Key things to remember as you build on this:
- Always compile with `--evm-version berlin` to match ChainLaunch's Berlin-genesis config — this is the most common gotcha
- QBFT needs 4+ validators — don't cut corners for enterprise demos
- ChainLaunch's RPC endpoints work with any Ethereum tooling: ethers.js, web3.py, Hardhat, Foundry, or MetaMask
- String-keyed items demo better than auto-incrementing IDs — stakeholders want to see meaningful identifiers
Ready to try it? Install ChainLaunch and run your first chainlaunch testnet besu command. For production features like RBAC, SSO, and automated backups, check out ChainLaunch Pro. Compare deployment options in our Besu deployment tools comparison.
David Viejo is the founder of ChainLaunch and a Hyperledger Foundation contributor. He created the Bevel Operator Fabric project and has been building blockchain infrastructure tooling since 2020.