New: Starlink & Alternative Broadband Dashboard
The Isle of Man spent £11.65 million on gap-funding Manx Telecom's FTTP fibre rollout. Nearly 48,400 premises are now passed, copper is being retired by 2029, and the government's broadband strategy looks — on paper — like a success story.
And yet: Starlink is growing.
We've added a new Starlink & Alternative Broadband page that tracks the rise of LEO satellite internet and other non-incumbent providers using CURA's quarterly market statistics. The page includes six interactive charts covering market share growth, estimated subscriber counts, incumbent duopoly erosion, technology splits, FTTP take-up resistance, and a pricing disruption analysis.
The £35 Problem
The most striking chart is the price comparison. When Starlink dropped its standard residential plan to £35/month for ~100 Mbps with no contract, it undercut every single Isle of Man fibre provider at the entry and standard tiers:
- Sure Fibre 100: £55/month (24-month contract)
- Manx Telecom Fibre 100: £60.40/month (24-month contract)
- Starlink Standard: £35/month (no contract, self-install)
Against Manx Telecom's Fibre 100 that's a 42% saving (36% against Sure's), with the added benefit of no engineer visit, no 24-month lock-in, and a 15-minute self-install. For renters, rural residents, and people frustrated with incumbent pricing, the economics are compelling.
FTTP Take-up Has Stalled
Despite near-universal fibre coverage, FTTP take-up has plateaued at around 56%. That means 44% of premises passed by fibre are choosing not to take it up — some remaining on legacy copper, some choosing wireless alternatives, and some opting out entirely.
With the copper sunset scheduled for 2029, thousands of households will face a forced migration. That window represents the largest single market opportunity for alternative providers since their launch.
The new page sits alongside our existing Broadband & Telecoms, Broadband Speeds, and Price Comparison dashboards — together forming a comprehensive view of Isle of Man connectivity.
MCP Server Security Hardening
We reviewed our MCP server — the 81-tool API that powers AI assistant integration with Isle of Man open data — against the newly published OWASP Practical Guide for Secure MCP Server Development (v1.0, February 2026).
The audit found that many of the guide's highest-severity recommendations (OAuth, cryptographic tool manifests, containerised sandboxing) are designed for multi-tenant servers handling sensitive data. Our server is read-only public data with tools we wrote ourselves, so the threat model is different. But three areas needed attention.
SQL Injection: 25 Tools Parameterised
The most significant fix. Multiple search tools were constructing raw SQL with string interpolation and single-quote escaping — a classic injection vulnerability. Even though the data is public and read-only, we converted every affected tool (25 files, 60+ query instances) from:
WHERE manufacturer LIKE N'%${userInput}%'
to properly parameterised queries:
WHERE manufacturer LIKE @P1
-- with params: [`%${userInput}%`]
This uses positional parameter placeholders that are natively handled by the SQL Server driver, making injection impossible regardless of input content.
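The conversion can be sketched as a small before/after in TypeScript. This is a minimal illustration of the pattern, not the actual tool code — the helper names (`buildUnsafe`, `buildParameterised`) are invented for the example; the key point is that the safe version never splices user input into the SQL text, only into a separate params array the driver binds.

```typescript
interface ParamQuery {
  sql: string;
  params: string[];
}

// UNSAFE (old pattern): user input spliced straight into the SQL text.
// Quote-escaping alone does not make this safe.
function buildUnsafe(column: string, userInput: string): string {
  const escaped = userInput.replace(/'/g, "''");
  return `WHERE ${column} LIKE N'%${escaped}%'`;
}

// SAFE (new pattern): a positional placeholder in the SQL, with the value
// carried out-of-band for the SQL Server driver to bind.
function buildParameterised(column: string, userInput: string): ParamQuery {
  return {
    sql: `WHERE ${column} LIKE @P1`,
    params: [`%${userInput}%`],
  };
}

const q = buildParameterised("manufacturer", "Tesla'; DROP TABLE--");
// q.sql is always "WHERE manufacturer LIKE @P1" — the hostile input only
// ever travels as a bound parameter value, so it can never change the query.
```

Because the SQL string is a constant, nothing the client sends can alter the query's structure; the driver treats the parameter purely as data.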
Rate Limiting: 100 Requests/Minute Per IP
We added an in-memory IP-based rate limiter to all MCP endpoints. The limiter uses a sliding window counter approach — 100 requests per minute per client IP, with HTTP 429 responses when exceeded. Stale entries are automatically cleaned up to prevent memory growth.
This protects the underlying Azure SQL database from being hammered by abusive clients while remaining generous enough for legitimate AI assistant usage.
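A sliding-window counter can be sketched in a few dozen lines. This is an illustrative implementation under the stated parameters (100 requests/minute, per-IP, with stale-entry cleanup), not the server's actual code; the class and method names are invented for the example.

```typescript
const WINDOW_MS = 60_000;  // one minute
const MAX_REQUESTS = 100;  // per IP per window

interface WindowEntry {
  windowStart: number; // start of the current fixed window
  count: number;       // requests seen in the current window
  prevCount: number;   // requests seen in the previous window
}

class SlidingWindowLimiter {
  private entries = new Map<string, WindowEntry>();

  // Returns true if the request is allowed; callers respond 429 otherwise.
  allow(ip: string, now: number = Date.now()): boolean {
    const windowStart = Math.floor(now / WINDOW_MS) * WINDOW_MS;
    let e = this.entries.get(ip);
    if (!e || e.windowStart !== windowStart) {
      // Roll the window: the old "current" count becomes "previous".
      const prevCount =
        e && e.windowStart === windowStart - WINDOW_MS ? e.count : 0;
      e = { windowStart, count: 0, prevCount };
      this.entries.set(ip, e);
    }
    // Weight the previous window by how much of it still overlaps the
    // sliding one-minute window ending at `now`.
    const overlap = 1 - (now - windowStart) / WINDOW_MS;
    const estimated = e.prevCount * overlap + e.count;
    if (estimated >= MAX_REQUESTS) return false;
    e.count += 1;
    return true;
  }

  // Drop IPs idle for two full windows to prevent unbounded memory growth.
  sweep(now: number = Date.now()): void {
    for (const [ip, e] of this.entries) {
      if (e.windowStart < now - 2 * WINDOW_MS) this.entries.delete(ip);
    }
  }
}
```

The counter variant keeps only two integers per IP rather than a timestamp log, trading a small amount of accuracy at window boundaries for constant memory per client.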
Error Message Cleanup
We scrubbed error responses to remove internal details (table names, file paths, scraper commands) that were being returned to clients. Detailed error information is now only logged server-side.
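The pattern is simple: log the full error server-side, return a generic message to the client. The sketch below is illustrative — `scrubError` and its signature are invented names, not the server's actual code.

```typescript
interface ClientError {
  error: string;
}

function scrubError(
  err: unknown,
  logServerSide: (detail: string) => void
): ClientError {
  // Full detail (table names, file paths, stack traces) stays in server logs.
  const detail =
    err instanceof Error ? `${err.message}\n${err.stack ?? ""}` : String(err);
  logServerSide(detail);
  // The client receives a stable, generic message with no internals in it.
  return { error: "An internal error occurred while executing this tool." };
}
```

A caught exception like `query failed: SELECT * FROM dbo.Vehicles` now reaches the client only as the generic message, while the detail remains available for debugging in the server logs.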
What We Didn't Change (And Why)
The OWASP guide recommends OAuth 2.1 for all remote MCP servers. We're not adding authentication because:
- The data is public — the same information is freely available on our website
- The mission is openness — adding OAuth would make the API harder to use for exactly the audience we want: researchers, data journalists, and AI assistants
- There's nothing to protect — no writes, no mutations, no user data, no secrets accessible via tools
Similarly, we skipped cryptographic tool manifests and tool signing. These make sense for marketplace-style MCP servers that load third-party tools. We ship all 81 tools ourselves from a single codebase.
The OWASP guide is excellent and we'd recommend it to anyone building MCP servers that handle sensitive data or user authentication. For our use case — a read-only open data API — the right response was targeted hardening rather than full compliance.
Also This Week
- Vital Statistics heatmaps updated to a red-amber-green colour gradient
- Schools Cohort Analysis now includes sixth form Y12-to-Y13 attrition tracking
- Election candidate pipeline refactored to per-candidate override files, preventing cross-contamination when updating individual candidates
