Battletoads - Functional Testing Case Study
Battletoads case study hero banner used as the page header

🎮 Battletoads - Functional Testing (PC Game Pass)

Tested: 27 Oct–1 Nov 2025

🧾 About this work

  • Author: Kelina Cowell - Junior QA Tester (Games)
  • Context: Self-directed manual QA portfolio project
  • Timebox: 1 week (27 Oct–1 Nov 2025)
  • Platform: PC (Xbox Game Pass) • Windows 11
  • Focus: Functional testing of core gameplay flows and input ownership

Introduction

One-week functional test pass on Battletoads (PC Game Pass, Win11 @1080p/144Hz) focused on core gameplay flows and input ownership. I built a compact suite, captured short evidence clips, and logged four reproducible defects with clear Jira tickets.

Accessibility note: readability observations include a dyslexic tester perspective.

| Studio | Platform | Scope |
|---|---|---|
| Dlala Studios / Rare | PC (Game Pass) | Gameplay logic • UI • Audio • Performance |

🎯 Goal

Demonstrate core QA fundamentals by validating key gameplay flows and documenting reproducible defects with clear severity and repro steps.

🧭 Focus Areas

  • Gameplay logic
  • UI / navigation
  • Input & controller
  • Audio
  • Performance

📄 Deliverables

  • Test plan (Google Sheets)
  • Bug report (PDF)
  • Evidence videos (YouTube)
  • Jira workflow / board screenshots
  • STAR summary

📊 Metrics

| Metric | Value |
|---|---|
| Total Bugs Logged | 4 |
| High | 4 |
| Medium | 0 |
| Low | 0 |
| Test Runs Executed | 18 |
| Repro Consistency | 100% (across 4 issues) |

⭐ STAR SUMMARY – Battletoads QA (PC Game Pass)

Situation: One-week functional test of Battletoads on Win11, Game Pass build 1.1F.42718, 1920×1080@144Hz, Diswoe X360 controller (Xbox-compatible mapping) + keyboard.

Task: Validate core gameplay logic, UI flow, input handling (keyboard/controller focus), audio cues, and basic performance.

Action: Built a test plan, executed the suite daily, captured repro video with Xbox Game Bar/OBS, and logged defects in Jira with clear titles, steps, and evidence.

Result: All four issues were fully reproducible, with clear evidence clips for each.


📖 Sources & oracles used

  • In-game control bindings and settings menus (PC Game Pass build)
  • Observed gameplay behaviour during live test runs
  • Xbox controller input conventions on Windows
  • Game UI expectations based on comparable titles in the same genre
  • Accessibility heuristics for readability and focus (non-colour cues, spatial grouping)

📚 JIRA Courses & Application

Before this project I’d only raised two bug tickets during a Game Academy Bootcamp (Sep 2025 cohort), so I completed two beginner Jira courses to get up to speed. They gave me the foundations to set up a clean board, define issue types and workflows, and attach clear evidence so every ticket was self-contained and easy to review.

Practice in this project

🎓 Certificates

| Certificate | Provider | Issued | Evidence |
|---|---|---|---|
| Introduction to JIRA | Simplilearn | 2025 | Certificate image (supports Jira workflow usage in this project) |
| Get Started with JIRA | Coursera | 2025 | Certificate image (supports Jira board setup and workflow use in this project) |

📷 Evidence & Media

These links are the complete artefacts for this project:

| Type | File / Link |
|---|---|
| QA Workbook (Google Sheets) | Open Workbook |
| QA Workbook (PDF Export) | Battletoads_QA_Functional_TestPlan_PCGamePass_Kelina_Cowell_PORTFOLIO.pdf |

🧪 Core Project Findings - Test Cases & Bugs

I executed the test suite (18 runs) and logged four High-severity input-ownership defects, all consistently reproducible. Evidence below: the Jira board overview, Verified-column thumbnails, and a bug table with video links.

📁 Jira Board Screenshot - Overview

Jira board overview showing Battletoads QA tickets across To Do, Blocked, In Progress, and Verified for traceability of findings

🗂️ Jira Board — Verified Screenshots (thumbnails)

Four thumbnails: Jira board filtered to Verified issues, showing the resolved/verified input-ownership defects with their evidence attachments (1–4).


🐞 Bugs - Summary + Videos

| ID | Title | Sev | Repro | Video |
|---|---|---|---|---|
| 01 | [PC][UI][Pause] Esc key does not open Pause menu when controller is connected | High | 5/5 | Bug 01 video |
| 02 | [PC][UI][Pause] Keyboard navigation ignored on Pause menu when controller is active | High | 5/5 | Bug 02 video |
| 03 | [PC][Input][Pause] Inconsistent keyboard/controller hand-off on Pause menu | High | 3/3 | Bug 03 video |
| 04 | [PC][Input][Join] Keyboard Enter from Pause opens Join In and disables controller input | High | 3/3 | Bug 04 video |

If you’re viewing this on github.com, embeds may not display. Use the thumbnails/links above or open this page on the published site (GitHub Pages) to watch inline.

📈 Results

See Metrics above for the full table and run references.


🤝 Networking & Applied Insight

During this project I sought targeted guidance from QA leaders and put it into practice. Following advice from Radu Posoi (Founder, Alkotech Labs; ex-Ubisoft QA Lead), I compared same-developer and same-genre titles, recording what happened and why it matters for the player. Donna Chin (QA Engineer, Peacock/NBCU) urged me to train my eye for accessibility issues in client UI. I built both recommendations into the study criteria, ran the comparative review alongside the main Battletoads testing, and converted the notes into measurable player-impact metrics (summarised in the table below) to prioritise issues and produce clearer bug reports.

➕ Add-on Study - Comparative Findings - First-Minute to Control, Pause → Back, HUD Readability

Radu Posoi — Founder, Alkotech Labs (ex-Ubisoft QA Lead)

Key takeaway: Be deliberate: compare same-dev and same-genre titles, and write down what happened + why it matters for the player.

How I applied it in this project:

  • Scoped two comparators: Battletoads vs Disney Illusion Island (same dev) and TMNT: Shredder’s Revenge (same genre).
  • Measured onboarding speed (presses to first control) and Pause → Back behaviour consistency with “what/why” notes.
  • Converted results into a comparison table and player-impact metrics to prioritise Battletoads issues and sharpen repro steps.

Evidence: LinkedIn message excerpt showing advice to compare same-developer and same-genre titles and record what happened and why it matters.

Donna Chin — QA Engineer, Peacock/NBCU

Key takeaway: Train your eye for accessibility in client UI: check readability, contrast, focus order, and non-colour cues.

How I applied it in this project:

  • Included HUD readability in the comparative criteria: legibility of health/energy/combo indicators, label clarity, reliance on colour-only cues, pop-up occlusion, and spatial grouping.
  • Captured “what happened” and “why it matters” notes for each HUD observation during combat.
  • Reflected those observations in the comparison table and summary metrics.

Evidence: LinkedIn message excerpt showing advice to check accessibility signals like readability, contrast, and focus order in UI.

⭐ MICRO-STAR SUMMARY – Comparative Findings

Situation: Post-pass curiosity: how do similar titles handle first-minute flow, Pause→Back, and HUD readability?

Task: Benchmark Battletoads against Teenage Mutant Ninja Turtles: Shredder’s Revenge and Disney Illusion Island to highlight UX strengths/risks.

Action: Timed title→control presses; reviewed Pause→Back behaviour; captured HUD readability snapshots.

Result: Battletoads reached control fastest (4 presses) and resumed cleanly; biggest risk: HUD readability in combat.

📊 Summary Metrics (across all three tests)

| Game | Timestamp | Area / Feature | What happened | Why it matters | Evidence |
|---|---|---|---|---|---|
| Battletoads | 0:07–1:12 (1:05) | Title → New Game | 4 presses to first control | Smooth first minute; low friction | Combat HUD screenshot 1 |
| Battletoads | 0:01–0:20 (0:19) | Title → Continue | 4 presses; intro dialogue skipped | Faster return-to-play; risk of missed context | Combat HUD screenshot 2 |
| Battletoads | 0:04–0:13 (0:09) | Pause → Back | Immediate control; no unintended actions | Prevents stray menu inputs | Combat HUD screenshot 3 |
| Battletoads | | HUD readability | Info spread across corners; tracking health/rank/combo is difficult mid-fight (tester is dyslexic and dyscalculic) | Accessibility/readability risk | Combat HUD screenshot 4 |
| Disney Illusion Island | 0:09–3:11 (3:02) | Title → Play | 13 presses to first control | Higher start friction than Battletoads (4) and TMNT (6) | Onboarding screenshot 1 |
| Disney Illusion Island | | Pause → Back | Not observed in available public footage (longplay/stream review) | No inference without evidence | |
| Disney Illusion Island | 3:11 | Onboarding | Just-in-time jump prompt | Clear guidance for new players | Onboarding screenshot 2 |
| Teenage Mutant Ninja Turtles: Shredder’s Revenge | 0:01–3:55 (3:54) | Title → Start | 6 presses to first control | Higher friction than Battletoads | HUD screenshot 1 |
| Teenage Mutant Ninja Turtles: Shredder’s Revenge | 0:06–0:16 (0:10) | Pause → Back | Immediate control; no unintended actions | Consistent resume behaviour | HUD screenshot 2 |
| Teenage Mutant Ninja Turtles: Shredder’s Revenge | | HUD readability | Tiny blue/green unlabeled bars; score pop-ups obstruct | Readability risk in combat | HUD screenshot 3 |

Method (Disney Illusion Island rows): Not hands-on — I reviewed public longplay/stream footage. Where behaviour wasn’t visible, it’s recorded as Not observed in available footage rather than guessed.

🏁 Result & takeaway

Result: Battletoads reaches first control in 4 presses (Teenage Mutant Ninja Turtles: Shredder’s Revenge 6, Disney Illusion Island 13) and resumes cleanly from Pause.

Takeaway: Battletoads was fastest to control among the three; the primary risk is HUD readability during combat.


🧩 What I learned


🔚 Conclusion

Full functional pass complete on Battletoads (PC Game Pass). I validated core flows end-to-end, stress-checked stability/performance, and documented 4 input-ownership defects with 100% repro, each backed by short videos and tidy Jira cards.


👤 Author

Kelina Cowell - Junior QA Tester (Games). Focused on manual testing, evidence-led bug reporting, and player-impact analysis.

LinkedIn profile

Relevant training & certifications

  • Introduction to Jira - Simplilearn (2025)
  • Get Started with Jira - Coursera (2025)

❓ FAQ

What was the goal of this project?

To demonstrate a structured manual QA workflow, clear bug reporting, and evidence-led communication in a timeboxed solo project.

What exactly did you deliver?

A bug log with reproduction steps and expected vs actual results, supporting video/screenshot evidence, and a structured QA workbook showing coverage and outcomes.

How did you decide what to test first?

I started with start→first control and Pause/Resume flows, then prioritised keyboard↔controller hand-off scenarios once input ownership issues appeared.

Is this representative of how you would work in a team?

Yes. The workflow, artefacts, and Jira usage mirror how I would operate within a small QA team, with the difference that this project was executed solo.

How do you ensure your bugs are actionable for developers?

Each issue includes clear repro steps, expected vs actual behaviour, environment details, and evidence that shows the defect and context.
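The ticket structure described above can be sketched as a small data model. This is an illustrative sketch only: the field names and the `is_actionable` helper are my assumptions for this example, not the actual Jira schema or fields used in the project.

```python
from dataclasses import dataclass, field

@dataclass
class BugReport:
    """Illustrative model of the fields each ticket carries (not the real Jira schema)."""
    bug_id: str
    title: str        # [Platform][Area][Feature] summary convention used in the bug table
    severity: str
    environment: str  # OS, build, display, input devices
    repro_steps: list[str] = field(default_factory=list)
    expected: str = ""
    actual: str = ""
    evidence: list[str] = field(default_factory=list)  # video/screenshot links

    def is_actionable(self) -> bool:
        # A ticket is treated as actionable only when a developer could
        # reproduce and verify it without follow-up questions.
        return bool(self.repro_steps and self.expected and self.actual
                    and self.environment and self.evidence)

# Example populated from Bug 01 in the summary table above
bug01 = BugReport(
    bug_id="01",
    title="[PC][UI][Pause] Esc key does not open Pause menu when controller is connected",
    severity="High",
    environment="Win11, Game Pass build 1.1F.42718, 1080p@144Hz, X360-compatible pad + keyboard",
    repro_steps=["Connect controller", "Start a run", "Press Esc during gameplay"],
    expected="Pause menu opens from keyboard Esc",
    actual="Esc is ignored while the controller owns input",
    evidence=["Bug 01 video"],
)
print(bug01.is_actionable())  # True
```

The same checklist works as a manual review gate: if any field would be empty, the ticket goes back for more detail before it reaches a developer.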

How do you handle duplicate findings or “same symptom, different cause” issues?

I group related observations, reference existing entries, and only create separate issues when the reproduction conditions or impact are clearly different.

What would you do next if this project continued?

Run a retest pass on logged issues, deepen functional coverage around input ownership scenarios, and create a small regression checklist for the highest-impact flows.

Up next: I’m moving on to an Exploratory & Edge-Case Testing project on Rebel Racing (Mobile) focused on device compatibility, input latency, UI scaling, and crash handling.



Email Me • Connect on LinkedIn • Back to Manual Portfolio hub


📎 Disclaimer

This is a personal, non-commercial portfolio for educational and recruitment purposes. I’m not affiliated with or endorsed by any game studios or publishers. All trademarks, logos, and game assets are the property of their respective owners. Any screenshots or short clips are included solely to document testing outcomes. If anything here needs to be removed or credited differently, please contact me and I’ll update it promptly.