Phase 1: Do the Tests Actually Work?
Rather than simply reviewing the test code, we used fault injection testing — a technique borrowed from reliability engineering. We systematically inserted 10 different types of programming mistakes into the custom modules and theme, then ran the Playwright test suite against each one.
The fault types were deliberately varied: PHP exceptions in block plugins, JavaScript syntax errors, broken CSS selectors, missing asset files, renamed HTML template elements, and removed form classes.
Key Finding: 5 of 10 injected faults went completely undetected. The test suite was blind to all JavaScript errors — a broken global.js that killed every Drupal behaviour on every page produced zero test failures.
Fault Injection Results
| Fault Type | Module | Detected? |
|---|---|---|
| HTML element renamed (header) | Theme template | Yes — 16 tests failed |
| JS syntax error | evo_generic | No |
| JS logic error (wrong selector) | evo_generic | No |
| PHP exception in block | hero_banner | Yes — HTTP 500 |
| JS fatal error (throw) | evo_base_theme | No |
| Missing JS asset file | evo_commerce | No |
| CSS class renamed | Theme template | Yes — 17 tests failed |
| Footer element removed | Theme template | Yes — 19 tests failed |
| AJAX command broken | evo_ajax_add_to_cart | No |
| Form class removed (PHP) | Theme .theme file | Yes — 1 test failed |
Closing the Gaps
We wrote four new tests and a browser error collection helper:
- A pageerror listener that captures uncaught JS exceptions on every page load
- A failed asset detector that flags 4xx/5xx responses for JS and CSS files
- An email validation behaviour test that checks the actual CSS red border, not just form HTML
- Dedicated JS error tests for the homepage and contact form
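The failed-asset detector boils down to a small classification rule: flag any JS or CSS response that comes back with a 4xx/5xx status, and record every uncaught page error. A minimal sketch of that logic in plain JavaScript, assuming Playwright feeds it via `page.on('response', ...)` and `page.on('pageerror', ...)`; the `isFailedAsset` function and `BrowserErrorCollector` class are illustrative names, not the project's actual helper:

```javascript
// Flag JS/CSS responses that came back with a 4xx or 5xx status.
function isFailedAsset(url, status) {
  const path = new URL(url).pathname;
  const isAsset = /\.(js|css)$/.test(path);
  return isAsset && status >= 400;
}

// Simple collector for uncaught page errors and failed assets.
// In a real suite this would be wired up per page:
//   page.on('pageerror', (e) => collector.onPageError(e.message));
//   page.on('response', (r) => collector.onResponse(r.url(), r.status()));
class BrowserErrorCollector {
  constructor() {
    this.errors = [];
  }
  onPageError(message) {
    this.errors.push({ type: 'pageerror', message });
  }
  onResponse(url, status) {
    if (isFailedAsset(url, status)) {
      this.errors.push({ type: 'failed-asset', url, status });
    }
  }
}

// Example: a missing script and an uncaught exception both get recorded.
const collector = new BrowserErrorCollector();
collector.onResponse('https://example.com/themes/custom/global.js', 404);
collector.onResponse('https://example.com/index.html', 200);
collector.onPageError('Uncaught SyntaxError: Unexpected token');
console.log(collector.errors.length); // 2
```

A test then simply asserts that the collector is empty at the end of each scenario, which is what turns silent JavaScript breakage into a visible failure.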
Detection rate went from 50% to 70%. The remaining three gaps require authenticated user flows that are outside the current test scope.
Phase 2: Performance Optimisation
A Lighthouse audit scored the site at 60/100 (mobile) and 87/100 (desktop). Server response time was excellent at 20ms, but JavaScript execution and render-blocking resources were dragging down the mobile score.
The deeper story was in the custom PHP code. A line-by-line review of every custom module and theme file revealed issues that had been accumulating for years:
Cache-Killing Code
Two preprocess hooks were setting max-age = 0 on blocks present on every page — the social links footer block and the cart block. This silently disabled Drupal’s Internal Page Cache site-wide. We replaced these with proper cache contexts and tags, restoring full page caching.
N+1 Query Patterns
A sponsor block was calling referencedEntities() inside a loop for every partnership taxonomy term — 10 partnership levels meant the same query ran 10 times. Award winner pages loaded partner nodes one-by-one. Slider pages were making 20+ individual entity loads for 5 images.
All replaced with batch operations: Node::loadMultiple(), Paragraph::loadMultiple(), and shared lookups moved outside loops.
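The Drupal fix uses Node::loadMultiple() and Paragraph::loadMultiple(), but the underlying pattern is language-agnostic: one batched lookup instead of one query per item. A sketch of the before/after shape in plain JavaScript, with a mock store and a hypothetical queryCount just to make the N+1 cost visible:

```javascript
// Mock storage that counts "queries" so the N+1 effect is visible.
const db = new Map([[1, 'Gold'], [2, 'Silver'], [3, 'Bronze']]);
let queryCount = 0;

function load(id) {            // one query per call: the N+1 shape
  queryCount++;
  return db.get(id);
}

function loadMultiple(ids) {   // one query for the whole batch
  queryCount++;
  return ids.map((id) => db.get(id));
}

const ids = [1, 2, 3];

// N+1 pattern: a query inside the loop.
queryCount = 0;
const slow = ids.map((id) => load(id));
console.log(queryCount); // 3

// Batched pattern: one query, then work in memory.
queryCount = 0;
const fast = loadMultiple(ids);
console.log(queryCount); // 1
```

With 10 partnership levels or 20+ slider entities, the loop version multiplies every page load by the item count; the batched version stays at one query regardless.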
Uncached Configuration
16 ConfigPages::config() calls per page with no static caching — each one hitting the database. We wrapped these in a drupal_static() helper function.
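drupal_static() is essentially per-request memoisation. The same idea, sketched in plain JavaScript with a hypothetical fetchConfigFromDb stand-in for the real database read:

```javascript
// Simulated expensive config read; counts database hits.
let dbHits = 0;
function fetchConfigFromDb(name) {
  dbHits++;
  return { name, value: `value-of-${name}` };
}

// Memoising wrapper: the first call per name hits the DB,
// every later call in the same request is served from memory.
const staticCache = new Map();
function cachedConfig(name) {
  if (!staticCache.has(name)) {
    staticCache.set(name, fetchConfigFromDb(name));
  }
  return staticCache.get(name);
}

// 16 calls per page, but only one DB hit per distinct config object.
for (let i = 0; i < 16; i++) {
  cachedConfig('site_settings');
}
console.log(dbHits); // 1
```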
Results
Post-optimisation Lighthouse metrics:
| Metric | Mobile | Desktop |
|---|---|---|
| First Contentful Paint | 3.3s | 0.8s |
| Largest Contentful Paint | 3.5s | 1.9s |
| Total Blocking Time | 720ms | 0ms |
| Cumulative Layout Shift | 0.007 | 0.006 |
| Server Response (TTFB) | 20ms | — |
Phase 3: SEO & AI Readiness
The SEO audit uncovered issues that had been silently undermining search performance:
- A hardcoded empty meta description tag in the base theme template was overriding the Metatag module's output on every page
- The Open Graph module was enabled but had zero configuration — no OG tags were being output
- No Twitter Card tags were configured
- No structured data (JSON-LD / schema.org) existed anywhere
- No Sitemap directive in robots.txt
- Duplicate meta tags in the HTML template conflicting with Drupal’s output
All fixed: Open Graph and Twitter Card defaults configured for all content types, JSON-LD Organization schema added, preconnect hints for CDN resources, and the empty description removed.
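An Organization schema of the kind added is a small JSON-LD block in the page head. The shape below follows schema.org's Organization type; the names and URLs are placeholders, not the client's actual data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/example",
    "https://x.com/example"
  ]
}
</script>
```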
AI Crawler Support
As AI-powered search becomes a significant traffic source, we proactively added:
- llms.txt and llms-full.txt — the emerging standard for AI model discovery, describing the company, services, and key content
- robots.txt directives allowing GPTBot, Google-Extended, ChatGPT-User, Claude-Web, Applebot-Extended, and PerplexityBot while blocking low-value scrapers (Bytespider, CCBot)
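In robots.txt terms, the allow/block split looks like the sketch below. The bot tokens are the ones named above; the Allow lines are explicit (crawling is allowed by default) and the Sitemap URL is a placeholder:

```
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```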
The Jira Integration
Every phase followed the same pattern: Claude Code posted the prompt and plan as a formatted Jira comment, executed the work, then posted the results — complete with colour-coded tables showing pass/fail status, metric values, and file references.
The Jira API calls went through the site’s existing custom Drupal module. Claude discovered the module by reading the codebase, learned its API, and used it throughout. No external tools, no manual copy-paste.
The ticket now serves as a complete audit trail: what was asked, what was planned, what was found, and what was fixed. That’s documentation with lasting value.
Summary
| Area | Before | After |
|---|---|---|
| Test fault detection | 5/10 (50%) | 7/10 (70%) |
| Playwright tests | 48 | 52 |
| Page cache | Disabled by cache kills | Restored with proper contexts |
| DB queries per page | ~30–40 | ~10–15 |
| Homepage meta description | Empty | Configured |
| Open Graph tags | None | Full coverage |
| Structured data | None | JSON-LD Organization |
| AI crawler support | None | llms.txt + robot directives |
What’s Next
The site is now being scanned with Siteimprove for accessibility compliance and pentest-tools for security vulnerabilities. Findings from those scans will be addressed as part of the same tracked workflow, maintaining the complete audit trail on the Jira ticket.
