Welcome to my Games QA Portfolio
Nine timeboxed case studies across PC, mobile, and VR, covering functional, exploratory, and regression testing, plus VR comfort and accessibility, XR learning and recovery workflows, cross-platform input/controller parity, narrative/localisation QA, and early automation.
This portfolio follows a nine-project roadmap I designed to mirror studio workflows. Each project includes a clear goal, a practical QA workbook, Jira issues, evidence clips and screenshots, and a concise STAR summary, with a focus on reproducible bugs and readable documentation.
The focus throughout is on practical judgement: scoping under time pressure, prioritising risk, and communicating findings clearly so they’re immediately actionable for developers.
Game Testing Portfolio Lineup
1: Functional Testing - Battletoads (PC)
Short functional test pass on Battletoads (PC Game Pass) to validate the core loop and surface repeatable bugs, with extra attention on keyboard to controller hand-off.
- Goal: Practise structured functional QA by checking core gameplay, menus and local co-op, and catching clear, reproducible defects.
- Focus: Pause / Join-In / Resume flows, controller ownership and hand-off, local co-op edge cases, menu navigation, on-screen prompts, and basic audio/performance sanity.
- Interesting result: The build felt stable overall but surfaced four high-impact input and hand-off defects, all 100% reproducible with clear steps and video.
- Evidence: Google Sheets QA workbook (cases, notes and PDF export), Jira issue tracking, and Xbox Game Bar plus OBS repro clips uploaded to YouTube.
- Why this game? A fast, recognisable brawler with drop-in local co-op, perfect for testing input ownership, pause behaviour and join-in flows under pressure.
2: Charter-based Exploratory & Edge-Case Testing - Rebel Racing (Mobile)
One-week exploratory / edge-case pass on a live F2P mobile racer, scoped to a single Android device.
- Goal: Practise charter-based mobile QA with daily golden-path smoke tests plus targeted edge-case sessions.
- Focus: Interruptions and recovery, LTE vs Wi-Fi, UI scaling and readability, input responsiveness, performance and device feel, and basic network / live surfaces (Store and Events).
- Interesting result: Found a post-race rewards soft lock after an OS alarm followed by an app close, along with smaller risks around LTE load delays and HUD readability.
- Evidence: Moto g54 recordings (scrcpy/OBS) and a Google Sheets workbook with charters, bug log and STAR summary.
- Why this game? A busy, live-service mobile racer with timers, pop-ups and network calls, ideal for practising interruption handling and real-world device edge cases.
3: Regression Testing - Sworn (PC Game Pass)
One-week change-driven regression pass on a recent Sworn patch, scoped to golden-path stability and patch-note fixes.
- Goal: Practise disciplined regression work by validating fixes, checking must-not-break systems, and catching side effects early.
- Focus: Save and load integrity, session start and quit flows, stamina and quest behaviour after fixes, and UI readability around low-health and stats.
- Interesting result: Confirmed a key fix held, resurfaced a persistent death-flow issue, and found minor UI feedback risks during core gameplay loops.
- Evidence: Regression matrix, Jira bug log, OBS recordings for before and after checks, and a STAR summary for key defects.
- Why this game? Sworn is patched often and has many interconnected systems, making it ideal for practising real change-driven regression instead of static case execution.
4: VR Comfort and Accessibility - Shadow Point (Quest 3)
One-week, charter-driven VR comfort and accessibility pass on Shadow Point (Quest 3 standalone, build 1.4), scoped to seated play in the tutorial and early chapters.
- Goal: Practise structured VR comfort and accessibility testing using established XR guidance, and document issues with clear evidence and traceability.
- Focus: Camera behaviour and horizon stability, locomotion and turning comfort, seated reach and motor strain risk, subtitle legibility and occlusion, non-audio cue redundancy at volume 0, and early-room cognitive barriers.
- Interesting result: Most issues clustered around subtitle occlusion and behaviour, plus confirmation gaps when audio cues were removed, showing how quickly comprehension can break in headset.
- Evidence: Charter Matrix, Session Log (S-001 to S-020), Bug Log (11 issues), STAR summary, glossary, and linked media evidence.
- Why this game? A slow-paced VR puzzler where comfort issues, text distance, and interaction clarity show up clearly without fast combat noise.
Read the Shadow Point VR comfort and accessibility case study →
5: XR Learning, Recovery & Accessibility - Browser-based 3D Game Design Tool
One-week solo case study using risk-based manual testing to evaluate first-time learner workflows in a browser-based 3D creation tool, covering onboarding, object manipulation, navigation clarity, save and recovery behaviour, and core accessibility checks.
- Goal: Validate that a first-time learner can start, follow instructions, complete core creation tasks, and recover from mistakes without losing work, using clear evidence and actionable reporting.
- Testing scope: Onboarding and “what do I do now?”, learning flow and cognitive load, object placement and transform, navigation and UI context, save behaviour and loss prevention, undo/redo expectations, and basic accessibility.
- Deliverables: Charter-based session log and evidence, issue list with reproducible bug reports, short UX and accessibility findings with practical recommendations, and a summary grouped by onboarding, editor usability, recovery and accessibility.
- Why this project? Shows how QA skills apply to educational creator tools, where clarity, recovery and accessibility matter as much as feature correctness.
- Note: Product name kept anonymous by agreement. Findings are shared privately, and a named public case study will only be published with the team’s approval.
Case study coming soon.
6: Multi-Input / Controller Parity QA - Recompile (PC)
Planned controller parity case study on Recompile (PC Game Pass), focused on input feel, mapping clarity and basic hot-swap handling.
- Goal: Practise structured controller QA by checking responsiveness, mapping accuracy and parity across two Xbox-style controllers.
- Focus: Input latency and feel, dead zones and drift, button and axis parity, Xbox prompts and icons, hot-swap behaviour and simple rebinding checks.
- Planned deliverables: Small input-response matrix, short latency notes, prompt and binding screenshots, and a STAR-style summary of key findings.
- Why this game? Recompile’s fast movement and frequent mode switches make it a good stress test for subtle input mismatches, latency spikes and inconsistent prompts.
Case study coming soon.
7: Narrative / Localisation QA - Oxenfree (Mobile)
Planned narrative and localisation QA pass on Oxenfree (Netflix Games, Android), focused on subtitle timing, tone and readability.
- Goal: Practise LQA-style testing by checking subtitle timing, readability and tone consistency, and logging issues in a localisation-friendly format.
- Focus: English subtitles and UI text only, including pacing vs VO, line breaks and overflow, speaker attribution, sensitive phrasing and UI legibility on a mobile screen.
- Planned deliverables: Short subtitle and tone report, simple readability checklist, and an LQA-style issue log with full lines, context and screenshots or clips.
- Why this game? Oxenfree’s walk-and-talk dialogue, radio tuning and branching story make timing, tone and readability critical, so it is ideal for practising narrative and localisation QA on mobile.
Case study coming soon.
8: Automation Testing - PowerWash Simulator 2 (PC)
Planned first automation case study on PowerWash Simulator 2, focused on simple smoke scripts and fast re-checks of core flows.
- Goal: Show early automation value by building small, repeatable checks that quickly re-validate core flows and settings after changes.
- Focus: Launch → menu → job select smoke path, settings persistence, basic save and load behaviour, and simple scripted input sequences for sanity checks.
- Planned deliverables: Lightweight smoke scripts, a small regression checklist, short clips of automated runs and a few comparison screenshots.
- Why this game? Clear, repeatable cleaning loops and stable menus make PowerWash Simulator 2 a low-risk candidate for early automation and fast regression passes.
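A settings-persistence check from the planned scope above can be sketched in a few lines of Python. This is a minimal, hypothetical example only: the file name `settings.json` and the keys (`resolution`, `vsync`, `master_volume`) are placeholders, not PowerWash Simulator 2's real config format.

```python
import json
import os
import tempfile

def settings_persist(path, expected):
    """Return True if every expected key/value survives in the saved settings file."""
    with open(path) as f:
        saved = json.load(f)
    return all(saved.get(key) == value for key, value in expected.items())

# Simulate the game writing its settings, then re-validate after a "restart".
# In a real run, `settings_path` would point at the game's actual config file.
workdir = tempfile.mkdtemp()
settings_path = os.path.join(workdir, "settings.json")
applied = {"resolution": "1920x1080", "vsync": True, "master_volume": 80}
with open(settings_path, "w") as f:
    json.dump(applied, f)

result = settings_persist(settings_path, applied)
print("settings persistence:", "PASS" if result else "FAIL")
```

The value here is speed, not coverage: a check like this re-runs in seconds after every patch, freeing manual time for the exploratory passes that automation can't replace.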
Case study coming soon.
9: Solo QA Workflow (Capstone) - [TBD Indie PC Game]
Planned portfolio capstone: working as the sole QA on a small indie PC title and combining everything from Projects 1–8 into one end-to-end workflow.
- Goal: Prove end-to-end QA capability as the only tester, from smoke tests and exploratory charters through to regression, accessibility checks and a final recommendation.
- Focus: Core loop stability, menus and settings, save and load behaviour, early to mid progression, basic accessibility and UX, input parity (KBM and controller) and light performance sanity.
- Planned deliverables: Small set of charters, bug log with evidence, simple regression checklist, brief process write-up and a release-style QA recommendation.
- Why this project? Designed as the final piece of the portfolio, showing how I would work as the sole QA on a small indie team and pull together functional, exploratory, regression, accessibility and input checks into one coherent workflow.
Case study coming soon.
✅ Coverage Map
Coverage map of the portfolio projects, showing platform, QA type, and testing focus at a glance.
| Project | QA Type | Platform | Focus | Status |
|---|---|---|---|---|
| Battletoads | Functional | PC (Game Pass) | Core flows · Input ownership · UI/menus | Live |
| Rebel Racing | Exploratory & Edge-Case | Mobile | Scaling · Touch · Interruptions · Network | Live |
| Sworn | Regression | PC (Game Pass) | Save/load · Patch-fix verification · UI readability | Live |
| Shadow Point | VR Comfort & Accessibility | Quest 3 | Comfort · Tracking · Subtitle legibility | Live |
| 3D Game Design Tool | XR Learning, Recovery & Accessibility | Browser | Onboarding · Clarity for non-gamers · Cognitive load · Accessibility · Comfort | In Progress |
| Recompile | Cross-Platform Input | PC / Controller | Mapping parity · Latency · Prompts | Coming soon |
| Oxenfree | Narrative / Localisation | Mobile | Subtitles · Timing · Readability | Coming soon |
| PowerWash Simulator 2 | Automation | PC (Game Pass) | Smoke scripts · Regression checks | Coming soon |
| [TBD Indie PC Game] | Solo QA Workflow | PC | Process doc · Charters · Bug log · Regression checks | Coming soon |
FAQ
What will I find on each project page?
A short project summary, scope and timebox, test approach, linked artefacts (workbook/logs), and supporting evidence. The goal is that you can review the work quickly without guessing what was done.
Which case study should I read first?
If you want a quick overview, start with Battletoads (functional) or Sworn (regression). If you're hiring for VR/XR work, start with Shadow Point or the XR learning project.
What level are these projects aimed at?
Entry-level to junior QA roles, with documentation quality aimed at being readable and actionable for developers and QA leads.
Are these self-directed projects or professional studio work?
These are self-directed portfolio projects using real products and builds. They are designed to reflect production QA habits: structured scope, clear reporting, and traceable outputs.
How do you decide what to test in a one-week timebox?
I prioritise high-impact user flows and change-driven risk, then use targeted charters to explore likely breakpoints and edge cases. The focus is depth in the riskiest areas rather than broad, shallow coverage.
What should I look at first if I have limited time?
Start with the most recent project, then skim the scope/timebox and the artefacts summary. If the approach fits what you need, jump to the findings and evidence links for a fast validation pass.
Is this portfolio focused on games QA only?
It is primarily games QA, but it also includes software-style testing work where relevant. The documentation and risk-based approach are intended to be transferable across both.
🛠️ Tools Used
QA tools used for manual game testing, bug tracking, documentation, evidence capture, and automation basics.
- Bug tracking: Jira
- Version control: GitHub
- Test documentation: Google Sheets (QA workbooks)
- PC video capture: OBS Studio
- PC video capture: Xbox Game Bar
- Evidence hosting: YouTube
- Mobile screen capture: scrcpy (Android)
- VR screen capture: Meta Quest built-in recording
- Automation basics: AutoHotkey
- Scripting basics: Python
🎯 Skills
Manual QA across PC, mobile, and VR, plus early automation. Clear bug reporting, evidence capture, and structured documentation.
- Functional testing
- Exploratory testing (charter-based)
- Regression testing
- Smoke testing
- Risk-based test design
- Bug reporting (repro steps, expected vs actual)
- Severity and priority triage
- Test documentation (workbooks, matrices, session logs)
- Evidence capture (clips and screenshots)
- Mobile interruptions and recovery
- VR comfort and accessibility testing
- Narrative and localisation QA basics
- Input and controller testing
- Automation basics (AutoHotkey, Python)
Summary
I’m a junior Game QA tester focused on clear, evidence-backed testing. This Games QA portfolio shows manual testing across PC, mobile, and VR, supported by tidy documentation, reproducible bugs, and short clips that make issues easy to validate. My aim is simple: help teams ship smoother, clearer experiences for players.

Email Me · Connect on LinkedIn · Back to Homepage