Disclaimer: Some internal fields and implementation details are omitted here for security reasons.
Checkpoint Protection asks visitors to solve a quick puzzle before letting them through, cutting down on automated traffic while keeping the experience smooth for real users.
When you navigate to a protected page, the verification flow is:

1. The middleware checks for a valid token cookie (`__Host-checkpoint_token`). If one is present and valid, the request passes through immediately.
2. Otherwise, the interstitial page is served and the browser requests challenge parameters from `/api/pow/challenge?id=REQUEST_ID`. This payload includes a random challenge, salt, difficulty, and hidden parameters.
3. The browser searches for a nonce such that `SHA-256(challenge + salt + nonce)` meets the difficulty.
4. The solution is submitted to `/api/pow/verify` along with the request ID; on success the server issues a signed token cookie, and subsequent requests pass the middleware.
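The nonce search in step 3 can be illustrated with a small sketch. This is not the production solver (which runs in JavaScript Web Workers, shown later); the function below is a hypothetical Go equivalent of the same brute-force loop:

```go
package main

import (
    "crypto/sha256"
    "encoding/hex"
    "fmt"
    "strconv"
    "strings"
)

// solveChallenge brute-forces a nonce so that the hex-encoded
// SHA-256(challenge + salt + nonce) starts with `difficulty` zeros.
// Illustrative sketch only; the real solver runs in the browser.
func solveChallenge(challenge, salt string, difficulty int) string {
    prefix := strings.Repeat("0", difficulty)
    for nonce := 0; ; nonce++ {
        input := challenge + salt + strconv.Itoa(nonce)
        sum := sha256.Sum256([]byte(input))
        if strings.HasPrefix(hex.EncodeToString(sum[:]), prefix) {
            return strconv.Itoa(nonce)
        }
    }
}

func main() {
    // With difficulty 4, roughly 16^4 ≈ 65,536 hashes are expected on average.
    nonce := solveChallenge("example-challenge", "example-salt", 4)
    fmt.Println("found nonce:", nonce)
}
```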
Challenges are generated using cryptographically secure random bytes combined with a salt for additional entropy:
```go
func generateChallenge() (string, string) {
    // Generate a random challenge
    randomBytes := make([]byte, 16)
    _, err := cryptorand.Read(randomBytes)
    if err != nil {
        log.Fatalf("CRITICAL: Failed to generate secure random challenge: %v", err)
    }

    // Generate a random salt for additional entropy
    saltBytes := make([]byte, saltLength)
    _, err = cryptorand.Read(saltBytes)
    if err != nil {
        log.Fatalf("CRITICAL: Failed to generate secure random salt: %v", err)
    }

    return hex.EncodeToString(randomBytes), hex.EncodeToString(saltBytes)
}
```
Security Note: The system uses Go's crypto/rand package for secure random number generation, ensuring challenges cannot be predicted even by sophisticated attackers.
Challenges are stored with a unique request ID and include parameters for verification:
```go
type ChallengeParams struct {
    Challenge  string    `json:"challenge"` // Base64 encoded
    Salt       string    `json:"salt"`      // Base64 encoded
    Difficulty int       `json:"difficulty"`
    ExpiresAt  time.Time `json:"expires_at"`
    ClientIP   string    `json:"-"`
    PoSSeed    string    `json:"pos_seed"` // Hex encoded
}
```
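The challenge store itself is not shown above. A minimal in-memory sketch, assuming a map keyed by request ID with expiry checks (the names challengeStore, storeChallenge, and lookupChallenge are illustrative; the sync and time imports are assumed):

```go
var (
    challengeStore = make(map[string]ChallengeParams) // keyed by request ID
    challengeMutex sync.RWMutex
)

// storeChallenge records the parameters issued for a request ID.
func storeChallenge(requestID string, params ChallengeParams) {
    challengeMutex.Lock()
    defer challengeMutex.Unlock()
    challengeStore[requestID] = params
}

// lookupChallenge returns the parameters for a request ID if they
// exist and have not expired.
func lookupChallenge(requestID string) (ChallengeParams, bool) {
    challengeMutex.RLock()
    defer challengeMutex.RUnlock()
    params, ok := challengeStore[requestID]
    if !ok || time.Now().After(params.ExpiresAt) {
        return ChallengeParams{}, false
    }
    return params, true
}
```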
When a client requests a challenge, the parameters are delivered in an obfuscated format to prevent automated analysis:
```json
{
    "a": "base64-encoded-challenge",
    "b": "base64-encoded-salt",
    "c": 4,
    "d": "hex-encoded-pos-seed"
}
```
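A minimal sketch of how the stored parameters could be mapped onto these short keys; the challengeResponse struct below is an assumption based on the documented payload, not the exact production type:

```go
// challengeResponse mirrors the obfuscated wire format.
type challengeResponse struct {
    A string `json:"a"` // challenge
    B string `json:"b"` // salt
    C int    `json:"c"` // difficulty
    D string `json:"d"` // proof-of-space seed
}

func toChallengeResponse(p ChallengeParams) challengeResponse {
    return challengeResponse{
        A: p.Challenge,
        B: p.Salt,
        C: p.Difficulty,
        D: p.PoSSeed,
    }
}
```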
The system performs a two-step verification process: a computational proof of work, described here, and a memory-bound proof of space, described in the next section.

For the computational proof, verification checks that the hash of the challenge, salt, and nonce combination has the required number of leading zeros:
```go
func verifyProofOfWork(challenge, salt, nonce string, difficulty int) bool {
    input := challenge + salt + nonce
    hash := calculateHash(input)

    // Check if the hash has the required number of leading zeros
    prefix := strings.Repeat("0", difficulty)
    return strings.HasPrefix(hash, prefix)
}

func calculateHash(input string) string {
    hash := sha256.Sum256([]byte(input))
    return hex.EncodeToString(hash[:])
}
```
In addition to the computational work, clients must prove they can allocate and manipulate significant memory resources. The browser runs a deterministic, seed-driven memory workload several times and reports the resulting hashes and timings (the h and i fields of the verification request).

The server verifies:

- the proof-of-space hash submitted for each run, and
- the reported run times, which must be plausible for a genuine memory-bound computation (a sketch follows below).
The dual-verification approach makes the system resistant to specialized hardware acceleration. While the computational proof can be solved by ASICs or GPUs, the memory proof is specifically designed to be inefficient on such hardware.
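The exact server-side checks are deliberately not documented in full. The sketch below shows the kind of validation implied by the submitted fields, under the assumption that the server sanity-checks hash format and run timings; verifyProofOfSpace and the timing bounds are illustrative, not the production implementation:

```go
// verifyProofOfSpace sketches plausible checks over the submitted
// proof-of-space run hashes and timings. Illustrative only.
func verifyProofOfSpace(hashes []string, times []int) bool {
    // Expect exactly three runs, matching the client implementation.
    if len(hashes) != 3 || len(times) != 3 {
        return false
    }
    // Hypothetical bounds: implausibly fast or slow runs are rejected.
    const minRunMillis, maxRunMillis = 10, 30000
    for _, t := range times {
        if t < minRunMillis || t > maxRunMillis {
            return false
        }
    }
    // Each run must produce a well-formed SHA-256 hex digest.
    for _, h := range hashes {
        if len(h) != 64 {
            return false
        }
        if _, err := hex.DecodeString(h); err != nil {
            return false
        }
    }
    return true
}
```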
Checkpoint tokens contain various fields for security and binding:
| Field | Description | Purpose |
|---|---|---|
| Nonce | The solution to the challenge | Verification proof |
| ExpiresAt | Token expiration timestamp | Enforces time-limited access (24 hours) |
| ClientIP | Hashed full client IP | Device binding (first 8 bytes of SHA-256) |
| UserAgent | Hashed user agent | Browser binding |
| BrowserHint | Derived from Sec-CH-UA headers | Additional client identity verification |
| Entropy | Random data | Prevents token prediction/correlation |
| Created | Token creation timestamp | Token age tracking |
| LastVerified | Last verification timestamp | Token usage tracking |
| Signature | HMAC signature | Prevents token forgery |
| TokenFormat | Version number | Backward compatibility support |
```go
type CheckpointToken struct {
    Nonce        string    `json:"g"` // Nonce
    ExpiresAt    time.Time `json:"exp"`
    ClientIP     string    `json:"cip,omitempty"`
    UserAgent    string    `json:"ua,omitempty"`
    BrowserHint  string    `json:"bh,omitempty"`
    Entropy      string    `json:"ent,omitempty"`
    Created      time.Time `json:"crt"`
    LastVerified time.Time `json:"lvf,omitempty"`
    Signature    string    `json:"sig,omitempty"`
    TokenFormat  int       `json:"fmt"`
}
```
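The table above notes that the client IP is bound via the first 8 bytes of a SHA-256 digest. A minimal sketch of that style of binding (hashClientIP and hashUserAgent are illustrative helper names; the user-agent hash length is an assumption):

```go
// hashClientIP binds a token to a client address without storing the
// raw IP: only the first 8 bytes of the SHA-256 digest are kept.
func hashClientIP(ip string) string {
    sum := sha256.Sum256([]byte(ip))
    return hex.EncodeToString(sum[:8])
}

// hashUserAgent binds a token to the browser's user agent string.
func hashUserAgent(userAgent string) string {
    sum := sha256.Sum256([]byte(userAgent))
    return hex.EncodeToString(sum[:])
}
```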
Every token is cryptographically signed using HMAC-SHA256 with a server-side secret:
```go
func computeTokenSignature(token CheckpointToken, tokenBytes []byte) string {
    tokenCopy := token
    tokenCopy.Signature = "" // Ensure signature field is empty for signing

    tokenToSign, _ := json.Marshal(tokenCopy)

    h := hmac.New(sha256.New, hmacSecret)
    h.Write(tokenToSign)
    return hex.EncodeToString(h.Sum(nil))
}

func verifyTokenSignature(token CheckpointToken, tokenBytes []byte) bool {
    if token.Signature == "" {
        return false
    }

    expectedSignature := computeTokenSignature(token, tokenBytes)
    return hmac.Equal([]byte(token.Signature), []byte(expectedSignature))
}
```
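After signing, the token is delivered as the __Host-checkpoint_token cookie, which by definition must be Secure, use Path "/", and carry no Domain attribute. A minimal Fiber sketch, assuming the token is serialized as base64-encoded JSON (issueTokenCookie and the SameSite choice are illustrative, not the production handler):

```go
// issueTokenCookie signs the token and sets it as a __Host- cookie.
func issueTokenCookie(c *fiber.Ctx, token CheckpointToken) error {
    token.Signature = computeTokenSignature(token, nil)

    payload, err := json.Marshal(token)
    if err != nil {
        return err
    }

    c.Cookie(&fiber.Cookie{
        Name:     CookieName, // "__Host-checkpoint_token"
        Value:    base64.StdEncoding.EncodeToString(payload),
        Expires:  token.ExpiresAt,
        Path:     "/",
        Secure:   true,
        HTTPOnly: true,
        SameSite: "Strict", // assumption; actual policy not documented
    })
    return nil
}
```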
Successfully verified tokens are stored in a persistent store for faster validation:
```go
// TokenStore manages persistent storage of verified tokens
type TokenStore struct {
    VerifiedTokens map[string]time.Time `json:"verified_tokens"`
    Mutex          sync.RWMutex         `json:"-"`
    FilePath       string               `json:"-"`
}

// Each token is identified by a unique hash
func calculateTokenHash(token CheckpointToken) string {
    data := fmt.Sprintf("%s:%s:%d",
        token.Nonce,              // Use nonce as part of the key
        token.Entropy,            // Use entropy as part of the key
        token.Created.UnixNano()) // Use creation time
    hash := sha256.Sum256([]byte(data))
    return hex.EncodeToString(hash[:])
}
```
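Persistence of the store is not shown above. A minimal sketch, assuming verified tokens are keyed by calculateTokenHash and flushed to the JSON file at FilePath (MarkVerified and save are illustrative names):

```go
// MarkVerified records a token as verified and persists the store.
func (s *TokenStore) MarkVerified(token CheckpointToken) error {
    s.Mutex.Lock()
    defer s.Mutex.Unlock()

    s.VerifiedTokens[calculateTokenHash(token)] = time.Now()
    return s.save()
}

// save writes the store to disk as JSON. Callers must hold the mutex.
func (s *TokenStore) save() error {
    data, err := json.MarshalIndent(s, "", "  ")
    if err != nil {
        return err
    }
    return os.WriteFile(s.FilePath, data, 0o600)
}
```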
The Checkpoint system can be configured through these constants:
| Constant | Description | Default |
|---|---|---|
| Difficulty | Number of leading zeros required in the hash | 4 |
| TokenExpiration | Duration for which a token is valid | 24 hours |
| CookieName | The cookie name storing the issued token | __Host-checkpoint_token |
| maxAttemptsPerHour | Rate limit for verification attempts | 10 |
| saltLength | Length of the random salt in bytes | 16 |
| maxNonceAge | Time before nonces are cleaned up | 24 hours |
| challengeExpiration | Time before a challenge expires | 5 minutes |
Warning: Increasing the Difficulty significantly increases the computational work required by clients: each additional leading zero multiplies the expected number of hashes by 16, so the default of 4 already averages about 16^4 ≈ 65,536 attempts. A value that is too high may result in poor user experience, especially on mobile devices.
```go
const (
    // Difficulty defines the number of leading zeros required in hash
    Difficulty = 4

    // TokenExpiration sets token validity period
    TokenExpiration = 24 * time.Hour

    // CookieName defines the cookie name for tokens
    CookieName = "__Host-checkpoint_token"

    // Max verification attempts per IP per hour
    maxAttemptsPerHour = 10

    // Salt length for additional entropy
    saltLength = 16
)
```
The Checkpoint system provides a middleware handler that automatically protects HTML routes while bypassing API routes and static assets. Detection is path-based: requests under /api, requests for files with non-HTML extensions, and a small allow-list of paths are passed through without a challenge.
```go
// HTMLCheckpointMiddleware handles challenges specifically for HTML pages
func HTMLCheckpointMiddleware() fiber.Handler {
    return func(c *fiber.Ctx) error {
        // Allow certain paths to bypass verification
        path := c.Path()
        if path == "/video-player" || path == "/video-player.html" || strings.HasPrefix(path, "/videos/") {
            return c.Next()
        }
        if strings.HasPrefix(path, "/api") {
            return c.Next()
        }
        if path == "/favicon.ico" || (strings.Contains(path, ".") && !strings.HasSuffix(path, ".html")) {
            return c.Next()
        }

        // Only apply to HTML routes
        isHtmlRoute := strings.HasSuffix(path, ".html") || path == "/" ||
            (len(path) > 0 && !strings.Contains(path, "."))
        if !isHtmlRoute {
            return c.Next()
        }

        token := c.Cookies(CookieName)
        if token != "" {
            valid, err := validateToken(token, c)
            if err == nil && valid {
                return c.Next()
            }
        }

        return serveInterstitial(c)
    }
}
```
```go
// Enable HTML checkpoint protection for all routes
app.Use(middleware.HTMLCheckpointMiddleware())

// API group with verification endpoints
api := app.Group("/api")

// Verification endpoints
api.Post("/pow/verify", middleware.VerifyCheckpointHandler)
api.Get("/pow/challenge", middleware.GetCheckpointChallengeHandler)

// Example protected API endpoint
api.Get("/protected", func(c *fiber.Ctx) error {
    // Access is already verified by cookie presence
    return c.JSON(fiber.Map{
        "message": "You have accessed the protected endpoint!",
        "time":    time.Now(),
    })
})
```
The client-side implementation is handled by the interstitial page and its associated JavaScript:

1. Fetch the challenge parameters from /api/pow/challenge?id=REQUEST_ID.
2. Solve the computational proof in Web Workers and run the memory proof.
3. Submit the solutions to the /api/pow/verify endpoint; once the token cookie is set, the protected page loads normally.
Computational proof is handled by Web Workers to avoid freezing the UI:
```javascript
function workerFunction() {
    self.onmessage = function(e) {
        const { type, data } = e.data;

        if (type === 'pow') {
            // PoW calculation
            const { challenge, salt, startNonce, endNonce, target, batchId } = data;
            let count = 0;
            let solution = null;

            processNextNonce(startNonce);

            function processNextNonce(nonce) {
                const input = String(challenge) + String(salt) + nonce.toString();
                const msgBuffer = new TextEncoder().encode(input);

                crypto.subtle.digest('SHA-256', msgBuffer)
                    .then(hashBuffer => {
                        const hashArray = Array.from(new Uint8Array(hashBuffer));
                        const result = hashArray.map(b =>
                            b.toString(16).padStart(2, '0')).join('');
                        count++;

                        if (result.startsWith(target)) {
                            solution = { nonce: nonce.toString(), found: true };
                            self.postMessage({
                                type: 'pow_result',
                                solution: solution,
                                count: count,
                                batchId: batchId
                            });
                            return;
                        }

                        if (nonce < endNonce && !solution) {
                            setTimeout(() => processNextNonce(nonce + 1), 0);
                        } else if (!solution) {
                            self.postMessage({
                                type: 'pow_result',
                                solution: null,
                                count: count,
                                batchId: batchId
                            });
                        }
                    });
            }
        }
    };
}
```
The memory proof allocates and manipulates large buffers to verify client capabilities:
```javascript
async function runProofOfSpace(seedHex, isDecoy) {
    // Deterministic memory size (48MB to 160MB) based on seed
    const minMB = 48, maxMB = 160;
    let seedInt = parseInt(seedHex.slice(0, 8), 16);
    const CHUNK_MB = minMB + (seedInt % (maxMB - minMB + 1));
    const CHUNK_SIZE = CHUNK_MB * 1024 * 1024;

    // Chunk memory for controlled allocation
    const chunkCount = 4 + (seedInt % 5); // 4-8 chunks
    const chunkSize = Math.floor(CHUNK_SIZE / chunkCount);

    // Run the proof multiple times to verify consistency
    const runs = 3;
    const hashes = [];
    const times = [];

    // For each run...
    for (let r = 0; r < runs; r++) {
        // Generate deterministic chunk order
        let prng = seededPRNG(seedHex + r.toString(16));
        let order = Array.from({length: chunkCount}, (_, i) => i);
        for (let i = order.length - 1; i > 0; i--) {
            const j = prng() % (i + 1);
            [order[i], order[j]] = [order[j], order[i]];
        }

        // Allocate and fill memory buffer
        let t0 = performance.now();
        let buf = new ArrayBuffer(CHUNK_SIZE);
        let view = new Uint8Array(buf);

        // Fill buffer with deterministic pattern
        for (let c = 0; c < chunkCount; c++) {
            let chunkIdx = order[c];
            let start = chunkIdx * chunkSize;
            let end = (chunkIdx + 1) * chunkSize;
            for (let i = start; i < end; i += 4096) {
                view[i] = prng() & 0xFF;
            }
        }

        // Hash the entire buffer
        let hashBuf = await crypto.subtle.digest('SHA-256', view);
        let t2 = performance.now();

        // Convert hash to hex string
        let hashHex = Array.from(new Uint8Array(hashBuf))
            .map(b => b.toString(16).padStart(2, '0')).join('');

        // Store results
        hashes.push(hashHex);
        times.push(Math.round(t2 - t0));

        // Clean up
        buf = null; view = null;
    }

    return { hashes, times };
}
```
The client-side implementation is designed to be difficult to reverse-engineer: obfuscated API responses, minimal logging, and anti-debugging measures all raise the cost of automated circumvention.
The Checkpoint system exposes two primary API endpoints:
Retrieves challenge parameters for a verification request:

```
GET /api/pow/challenge?id=REQUEST_ID
```

Response:

```json
{
    "a": "base64-encoded-challenge",
    "b": "base64-encoded-salt",
    "c": 4,
    "d": "hex-encoded-pos-seed"
}
```
Accepts proof solutions and issues tokens when valid. In the request, g carries the proof-of-work nonce, h the proof-of-space hashes from each run, and i the corresponding run times in milliseconds:

```
POST /api/pow/verify
```

Request:

```json
{
    "request_id": "unique-request-id",
    "g": "nonce-solution",
    "h": ["pos-hash1", "pos-hash2", "pos-hash3"],
    "i": [time1, time2, time3]
}
```

Response:

```json
{
    "token": "base64-encoded-token",
    "expires_at": "2025-04-17T18:57:48Z"
}
```
Backwards Compatibility: The older endpoint /api/verify is maintained for compatibility with existing clients.