API Documentation
All endpoints return JSON. Base URL: http://212.47.71.50:8092
GET /api/odds
Get all matches with odds from all bookmakers.
{
  "matches": [
    {
      "home": "Team A",
      "away": "Team B",
      "league": "Premier League",
      "sport": "Soccer",
      "bookmaker": "betika",
      "start_time": 1702483200,
      "is_live": false,
      "odds": {"home": 1.85, "draw": 3.40, "away": 4.20}
    }
  ]
}
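Since every match object carries its own bookmaker field, a flat response like the one above can be regrouped client-side. A minimal sketch (the sample payload below reuses the shape of the response example; the second match is illustrative data, not from the API):

```javascript
// Sample payload in the shape of the /api/odds response above.
const payload = {
  matches: [
    { home: "Team A", away: "Team B", league: "Premier League",
      sport: "Soccer", bookmaker: "betika", start_time: 1702483200,
      is_live: false, odds: { home: 1.85, draw: 3.40, away: 4.20 } },
    { home: "Team C", away: "Team D", league: "La Liga",
      sport: "Soccer", bookmaker: "superbet", start_time: 1702486800,
      is_live: true, odds: { home: 2.10, draw: 3.10, away: 3.60 } },
  ],
};

// Group matches by their bookmaker field.
function groupByBookmaker(data) {
  const groups = {};
  for (const m of data.matches) {
    (groups[m.bookmaker] ??= []).push(m);
  }
  return groups;
}

const groups = groupByBookmaker(payload);
console.log(Object.keys(groups)); // the bookmaker names present in the payload
```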
GET /api/odds?bookmaker=betika
Filter matches by bookmaker (betika, superbet, winner).
GET /monitor
Monitoring dashboard with system metrics and scraper status.
GET /api/odds?bookmaker=winner
Filter matches by bookmaker (Winner).
curl -s "http://212.47.71.50:8092/api/odds?bookmaker=winner" | jq
GET /api/stats
Get system statistics (CPU and RAM usage).
{"cpu": 15, "ram": 42}
GET /health
Health check endpoint; returns Winner errors, RAM usage, PID count, and parsing status.
{
  "status": "OK",
  "winner_errors": 0,
  "ram_mb": 491,
  "pids": 148,
  "parsed_last_3min": true
}
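A consumer can turn the /health fields above into a simple ready/not-ready decision. A minimal sketch (field names come from the example response; treating any non-zero winner_errors as unhealthy is an assumption, not a documented rule):

```javascript
// Minimal readiness check over a /health response.
// Assumption: status must be "OK", winner_errors must be 0,
// and the scraper must have parsed within the last 3 minutes.
function isHealthy(h) {
  return h.status === "OK" &&
         h.winner_errors === 0 &&
         h.parsed_last_3min === true;
}

const sample = {
  status: "OK", winner_errors: 0, ram_mb: 491,
  pids: 148, parsed_last_3min: true,
};
console.log(isHealthy(sample)); // true
```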
Usage Examples
JavaScript/Node.js
// Fetch all matches
fetch('http://212.47.71.50:8092/api/odds')
  .then(r => r.json())
  .then(data => {
    console.log('Total matches:', data.matches.length);
    data.matches.forEach(m => {
      console.log(`${m.home} vs ${m.away} - ${m.bookmaker}`);
    });
  });

// Fetch only Superbet odds
fetch('http://212.47.71.50:8092/api/odds?bookmaker=superbet')
  .then(r => r.json())
  .then(data => console.log(data));
Python
import requests

# Get all matches
response = requests.get('http://212.47.71.50:8092/api/odds')
data = response.json()

for match in data['matches']:
    print(f"{match['home']} vs {match['away']}")
    print(f"  Odds: {match['odds']}")
    print(f"  Bookmaker: {match['bookmaker']}")
cURL
# Get all matches
curl -s http://212.47.71.50:8092/api/odds | jq
# Get only soccer matches
curl -s "http://212.47.71.50:8092/api/odds?sport=Soccer" | jq
# Get scraper status
curl -s http://212.47.71.50:8092/api/scrapers | jq
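Rather than hand-building query strings like the ones above, filters can be assembled with URLSearchParams. A small sketch (the base URL and the bookmaker/sport parameter names come from the examples above; combining several filters in one request is an assumption):

```javascript
// Base URL from the documentation above.
const BASE = "http://212.47.71.50:8092/api/odds";

// Build an /api/odds URL from a filter object, e.g. { bookmaker: "winner" }.
function oddsUrl(filters = {}) {
  const qs = new URLSearchParams(filters).toString();
  return qs ? `${BASE}?${qs}` : BASE;
}

console.log(oddsUrl({ bookmaker: "winner" }));
// http://212.47.71.50:8092/api/odds?bookmaker=winner
```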
Export Data
Export current odds data in various formats. The export view shows live counts of total matches and of matches per bookmaker (Betika, Superbet, Winner), a live JSON preview, and live logs.
System Statistics
Live gauges for CPU usage (%), RAM usage (MB), uptime, and requests/min.

Scraper Performance
Per-scraper table with columns: Scraper, Status, Matches, Success, Errors, Interval, Last Duration, Next In, plus a performance chart.
🚦 Tests / Validation (Release Gate)
The release verdict is determined automatically: ✅ READY only if all tests pass and there are no JS / 404 / 500 errors.
Guard counters: console errors (last 5 min) and network 404/500 responses (last 5 min), with an overall state (e.g. IDLE).

1) Capabilities Check
Probes the available Endpoints and Filters.

2) Run Full Self-Test (capabilities-first)
Reports Total, PASS, and FAIL counts plus the time of the last run.
Definition of Done: All tests PASS + Console 0 errors + Network 0×(404/500) in last 60s idle.
3) Console & Network Guard
- Intercepts window.onerror and unhandledrejection
- Intercepts fetch 404/500 responses (wrapper over window.fetch)
- If any error occurs within the last 5 minutes → automatic verdict NOT READY
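The guard above can be sketched as follows: record timestamped error events from the global handlers and a fetch wrapper, then derive the verdict from the last 5 minutes. The recordError/verdict names are hypothetical, and globalThis stands in for window so the sketch also runs under Node:

```javascript
const WINDOW_MS = 5 * 60 * 1000; // the 5-minute guard window
const events = [];

// Record one guard event (e.g. from window.onerror / unhandledrejection).
function recordError(kind, at = Date.now()) {
  events.push({ kind, at });
}

// Wrapper over the global fetch, mirroring the window.fetch wrapper:
// any 404 or 5xx response is recorded as a guard event.
const realFetch = globalThis.fetch;
globalThis.fetch = async (...args) => {
  const res = await realFetch(...args);
  if (res.status === 404 || res.status >= 500) recordError(`http_${res.status}`);
  return res;
};

// NOT READY if any event landed inside the window, else READY.
function verdict(now = Date.now()) {
  const recent = events.filter(e => now - e.at < WINDOW_MS);
  return recent.length === 0 ? "READY" : "NOT READY";
}
```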
Configuration
API Access
Generate API keys for third-party access
Scraper Settings
- Scrape Interval (seconds)
- Request Timeout (seconds)
- Max Retries
Enabled Bookmakers
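The settings above could be represented as a single config object. An illustrative shape (the field names map to the settings listed; the default values and the config format itself are assumptions, not documented defaults):

```javascript
// Hypothetical scraper configuration mirroring the settings above.
const scraperConfig = {
  scrapeIntervalSec: 60,  // "Scrape Interval" (seconds)
  requestTimeoutSec: 30,  // "Request Timeout" (seconds)
  maxRetries: 3,          // "Max Retries"
  // "Enabled Bookmakers" — the three bookmakers named in this document.
  enabledBookmakers: ["betika", "superbet", "winner"],
};

console.log(scraperConfig.enabledBookmakers.join(", "));
// betika, superbet, winner
```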
🛡️ Live Monitoring Dashboard
Real-time system health and process monitoring
Headline counters: Requests/min, Events sent/min, and Active clients.
⚕️ Health Status
Mirrors /health: Winner errors, RAM (MB), PID count, and the parsed-last-3-min flag, each shown with a per-minute delta (Δ).
🔐 Process Protection
Tracks the main scraper PID, child scraper PIDs, main and children counts, zombie processes, the watchdog and scraper lock states, and an overall verdict.
ℹ️ System Info
Shows build time, uptime, Camoufox container status, and the last-updated timestamp.
🧪 API Testing
Buttons that run individual endpoint tests on demand.

👥 Live Clients
Lists currently connected clients.

📋 Live Logs
Tails the last 20 log lines.