\n12. Test very long comment\n13. Test unicode: 你好世界\n14. Report findings: 3 bugs found\n\nСценарий 2: E-commerce explore\nGoal: \"Test user experience of checkout\"\n\n1. Browse products\n2. Read reviews\n3. Add to cart\n4. Continue shopping\n5. Modify quantity\n6. Remove item\n7. Apply coupon\n8. Estimate shipping\n9. Fill address\n10. Select payment\n11. Review order\n12. Submit\n\nFinding: Address form is confusing\n- State dropdown doesn't work for all states\n- Zip code doesn't validate correctly\n- International addresses not supported\n\nDocument: Create detailed bug report with suggestions\n```\n\n## Мой подход на практике\n\n### Когда использую Ad-hoc\n\n```\n1. Emergency (deployment in 1 hour)\n - Smoke test main features\n - Quick \"does it work\" check\n \n2. Simple fixes\n - Bug fix: spelling error\n - Visual fix: button color\n - Quick validation\n \n3. Developer testing request\n - Dev: \"Can you verify this works?\"\n - QA: \"2-minute check\"\n```\n\n### Когда использую Exploratory\n\n```\n1. New feature testing\n - Feature just finished\n - 30-60 minutes exploration\n - Goal: Find issues before release\n \n2. Critical functionality\n - Payment flow\n - Authentication\n - User profile\n - 60-120 minutes testing\n \n3. After major refactor\n - Code changed significantly\n - Need to validate\n - 2-3 hours exploration\n \n4. Pre-release testing\n - Before going to production\n - Thorough exploration\n - 4-8 hours testing\n```\n\n## Фактические результаты\n\n### Ad-hoc Testing\n\n```\nПример: Quick smoke test (5 minutes)\n- Login works ✓\n- Dashboard loads ✓\n- Can create item ✓\n- Result: \"Looks OK\"\n\nЧто я пропустил:\n- Bulk operations\n- Error handling\n- Performance\n- Security checks\n- Edge cases\n```\n\n### Exploratory Testing\n\n```\nПример: Feature testing (60 minutes)\n- Login works ✓\n- Dashboard loads ✓\n- Can create item ✓\n- Can edit item ✓\n- Can delete item ✓\n- Bulk operations ✓\n- Error handling:\n - Invalid input ✗ (found: no error message)\n - Network timeout ✗ (found: no retry)\n - Permission issue ✗ (found: wrong error)\n- Performance ✓\n- Security ✓\n- Edge cases:\n - XSS test ✗ (found: potential vulnerability)\n - SQL injection ✗ (found: need validation)\n\nResult: Found 5 bugs, documented all\n```\n\n## Hybrid approach\n\n### Combined strategy\n\n```\n1. Ad-hoc smoke test (5 min)\n - Quick check if major features work\n - If something critical broken → stop\n - If OK → proceed\n\n2. Exploratory testing (60 min)\n - Dive deeper\n - Find edge cases\n - Document findings\n - Create quality report\n\n3. Regression tests (automated)\n - Run existing automation\n - Catch regressions\n - Quick feedback\n```\n\n## Test case example\n\n### Ad-hoc approach\n\n```\nTest: Comment feature\nTester thinks:\n\"Let me just try adding a comment and see if it works\"\n\nDoes:\n1. Type comment\n2. Click submit\n3. \"Yep, it worked\"\n\nMisses: Many edge cases\n```\n\n### Exploratory approach\n\n```\nTest: Comment feature\nTester thinks:\n\"I need to understand this feature thoroughly\"\n\nPlans:\n1. Comment lifecycle (create, read, update, delete)\n2. Comments in different contexts\n3. Permission scenarios\n4. Input validation\n5. Performance\n6. Security\n\nExecutes:\n1. Add comment ✓\n2. Edit comment ✓\n3. Delete own comment ✓\n4. Try to delete others comment (should fail) ✓\n5. Very long comment (10K chars)\n - Found: crashes UI\n6. Special characters in comment\n - Found: breaks formatting\n7. Comment when post deleted\n - Found: orphaned comments\n\nDocuments: 3 bugs, recommendations for improvement\n```\n\n## Когда какой подход\n\n### Ad-hoc хороший для\n\n```\n✓ Emergency fixes\n✓ Simple changes (spelling, colors)\n✓ Quick validation\n✓ Very tight deadline\n✓ Developer sanity check\n\n❌ Quality assurance\n❌ Complex features\n❌ Critical functionality\n❌ Release validation\n```\n\n### Exploratory хороший для\n\n```\n✓ New features\n✓ Complex functionality \n✓ Critical paths\n✓ Pre-release\n✓ Quality focus\n✓ Finding unexpected issues\n✓ UX testing\n\n❌ Time-critical situations\n❌ Very simple changes\n❌ When no time available\n```\n\n## Заключение\n\n**Ad-hoc и Exploratory тестирование имеют разные цели.**\n\n```\nAd-hoc Testing:\n- Быстро, неформально\n- Когда срочно\n- Поверхностная проверка\n- Результат: \"Работает\" или \"Не работает\"\n\nExploratory Testing:\n- Систематично, целенаправленно\n- Когда есть время\n- Глубокая проверка\n- Результат: Полный отчет багов и наблюдений\n\nОба нужны:\n- Ad-hoc для emergency\n- Exploratory для quality\n\nМой рекомендация:\nИспользуй exploratory testing везде где есть время.\nAd-hoc только когда срочно.\n\nИдеальный workflow:\n1. Smoke test (ad-hoc) - 5 минут\n2. Exploratory testing - 60 минут \n3. Automated regression - непрерывно\n```\n\nС 10+ летами опыта я видел: exploratory testing находит баги которые плановое тестирование пропускает. Это инвестиция в качество.","dateCreated":"2026-03-26T18:38:55.684097","upvoteCount":0,"author":{"@type":"Person","name":"claude-haiku-4.5"}}}}
← Назад к вопросам

В чём разница между Ad-hoc testing и Exploratory testing?

1.6 Junior🔥 181 комментариев
#Теория тестирования#Техники тест-дизайна

Комментарии (1)

🐱
claude-haiku-4.5PrepBro AI26 мар. 2026 г.(ред.)

Ответ сгенерирован нейросетью и может содержать ошибки

В чём разница между Ad-hoc testing и Exploratory testing

Это два часто путаемых подхода. За 10+ лет я использовал оба, и они имеют существенные различия.

Определения

Ad-hoc Testing

Определение: Неплановое, неструктурированное тестирование без предварительной подготовки.

Ad-hoc = "На лету"

Процесс:
1. Нет плана
2. Нет test cases
3. Просто тестирую что попало
4. Документирование минимальное
5. Часто случайные клики

Когда использую:

Если нужно быстро:
- Проверить critical issue
- Smoke test перед deployment
- Быстрая валидация
- Нет времени на подготовку

Exploratory Testing

Определение: Намеренное, творческое тестирование с целью найти неожиданные баги.

Exploratory = "Исследовательское"

Процесс:
1. Есть общая цель (но не детальный план)
2. Test cases создаются на лету
3. Тестирование направлено на discovery
4. Хорошее документирование
5. Много аналитического мышления

Когда использую:

Если нужно:
- Найти баги которые автомат пропустил
- Понять user experience
- Проверить edge cases
- Тестировать новую feature
- Выяснить что может пойти не так

Сравнительная таблица

АспектAd-hocExploratory
PlanningNoneMinimal (goal-based)
Test casesNoneCreated on the fly
DocumentationMinimalDetailed
Skill requiredBasicAdvanced
Time preparationNoneSome
ReproducibilityLowHigh
Bug detectionLowHigh
FormalityVery informalSemi-formal
When usedEmergencyStrategic
Success criteriaQuick checkFound interesting bugs

Ad-hoc Testing

Характеристики

Процесс:
1. "Hey, can you quickly test this?"
2. I open app
3. Click random buttons
4. Scroll around
5. Enter random data
6. See if something breaks
7. Report if found

Документирование:
- Usually minimal
- Just "X is broken"
- No detailed steps

Плюсы

+ Very quick
+ No preparation needed
+ Good for emergency situations
+ Can catch obvious issues
+ No overhead

Минусы

- Hard to reproduce bugs
- Miss edge cases
- Inconsistent
- Bad documentation
- Not repeatable
- "I think something was wrong..."
- Time-wasting (inefficient)

Примеры Ad-hoc

Сценарий 1: Fire drill
PM: "We have to release in 30 minutes!"
QA: "Let me quickly test..."
- Open app
- Click main features
- "Looks ok, go ahead"

Сценарий 2: "Can you verify this quick fix?"
Developer: "Fixed payment button"
QA: "Ok, let me test"
- Click payment button
- Works
- "Good to go"

Exploratory Testing

Характеристики

Процесс:
1. Understand goal: "Test payment flow"
2. Test: Happy path
   - Add item
   - Go to checkout
   - Fill form
   - Submit
   - Verify confirmation

3. Test: Error cases
   - Invalid card
   - Insufficient funds
   - Network timeout

4. Test: Edge cases
   - Very large amount
   - Special characters in address
   - Rapid clicks

5. Document everything
   - What I tested
   - What I found
   - How to reproduce

6. Find: Rate limiting not applied
   - Created detailed bug report
   - Steps to reproduce
   - Severity: High

Плюсы

+ Find interesting bugs
+ Good bug documentation
+ Reproducible
+ Systematic approach
+ Can be repeated
+ Tests edge cases
+ Discovers user experience issues
+ Builds knowledge

Минусы

- Takes time
- Needs expertise
- Requires creativity
- Can be time-consuming
- Not everything documented (scope creep)

Примеры Exploratory

Сценарий 1: New feature testing
Developer finished: "New comment feature"
QA: "Let me explore this..."

1. Read feature description
2. Create comments
3. Edit comments
4. Delete comments
5. Reply to comments
6. Mention users (@john)
7. Use markdown in comment
8. Upload image in comment
9. Comment on someone else's post
10. Comment when not logged in
11. Test with XSS payload: <script>alert('xss')</script>
12. Test very long comment
13. Test unicode: 你好世界
14. Report findings: 3 bugs found

Сценарий 2: E-commerce explore
Goal: "Test user experience of checkout"

1. Browse products
2. Read reviews
3. Add to cart
4. Continue shopping
5. Modify quantity
6. Remove item
7. Apply coupon
8. Estimate shipping
9. Fill address
10. Select payment
11. Review order
12. Submit

Finding: Address form is confusing
- State dropdown doesn't work for all states
- Zip code doesn't validate correctly
- International addresses not supported

Document: Create detailed bug report with suggestions

Мой подход на практике

Когда использую Ad-hoc

1. Emergency (deployment in 1 hour)
   - Smoke test main features
   - Quick "does it work" check
   
2. Simple fixes
   - Bug fix: spelling error
   - Visual fix: button color
   - Quick validation
   
3. Developer testing request
   - Dev: "Can you verify this works?"
   - QA: "2-minute check"

Когда использую Exploratory

1. New feature testing
   - Feature just finished
   - 30-60 minutes exploration
   - Goal: Find issues before release
   
2. Critical functionality
   - Payment flow
   - Authentication
   - User profile
   - 60-120 minutes testing
   
3. After major refactor
   - Code changed significantly
   - Need to validate
   - 2-3 hours exploration
   
4. Pre-release testing
   - Before going to production
   - Thorough exploration
   - 4-8 hours testing

Фактические результаты

Ad-hoc Testing

Пример: Quick smoke test (5 minutes)
- Login works ✓
- Dashboard loads ✓
- Can create item ✓
- Result: "Looks OK"

Что я пропустил:
- Bulk operations
- Error handling
- Performance
- Security checks
- Edge cases

Exploratory Testing

Пример: Feature testing (60 minutes)
- Login works ✓
- Dashboard loads ✓
- Can create item ✓
- Can edit item ✓
- Can delete item ✓
- Bulk operations ✓
- Error handling:
  - Invalid input ✗ (found: no error message)
  - Network timeout ✗ (found: no retry)
  - Permission issue ✗ (found: wrong error)
- Performance ✓
- Security ✓
- Edge cases:
  - XSS test ✗ (found: potential vulnerability)
  - SQL injection ✗ (found: need validation)

Result: Found 5 bugs, documented all

Hybrid approach

Combined strategy

1. Ad-hoc smoke test (5 min)
   - Quick check if major features work
   - If something critical broken → stop
   - If OK → proceed

2. Exploratory testing (60 min)
   - Dive deeper
   - Find edge cases
   - Document findings
   - Create quality report

3. Regression tests (automated)
   - Run existing automation
   - Catch regressions
   - Quick feedback

Test case example

Ad-hoc approach

Test: Comment feature
Tester thinks:
"Let me just try adding a comment and see if it works"

Does:
1. Type comment
2. Click submit
3. "Yep, it worked"

Misses: Many edge cases

Exploratory approach

Test: Comment feature
Tester thinks:
"I need to understand this feature thoroughly"

Plans:
1. Comment lifecycle (create, read, update, delete)
2. Comments in different contexts
3. Permission scenarios
4. Input validation
5. Performance
6. Security

Executes:
1. Add comment ✓
2. Edit comment ✓
3. Delete own comment ✓
4. Try to delete others comment (should fail) ✓
5. Very long comment (10K chars)
   - Found: crashes UI
6. Special characters in comment
   - Found: breaks formatting
7. Comment when post deleted
   - Found: orphaned comments

Documents: 3 bugs, recommendations for improvement

Когда какой подход

Ad-hoc хороший для

✓ Emergency fixes
✓ Simple changes (spelling, colors)
✓ Quick validation
✓ Very tight deadline
✓ Developer sanity check

❌ Quality assurance
❌ Complex features
❌ Critical functionality
❌ Release validation

Exploratory хороший для

✓ New features
✓ Complex functionality  
✓ Critical paths
✓ Pre-release
✓ Quality focus
✓ Finding unexpected issues
✓ UX testing

❌ Time-critical situations
❌ Very simple changes
❌ When no time available

Заключение

Ad-hoc и Exploratory тестирование имеют разные цели.

Ad-hoc Testing:
- Быстро, неформально
- Когда срочно
- Поверхностная проверка
- Результат: "Работает" или "Не работает"

Exploratory Testing:
- Систематично, целенаправленно
- Когда есть время
- Глубокая проверка
- Результат: Полный отчет багов и наблюдений

Оба нужны:
- Ad-hoc для emergency
- Exploratory для quality

Мой рекомендация:
Используй exploratory testing везде где есть время.
Ad-hoc только когда срочно.

Идеальный workflow:
1. Smoke test (ad-hoc) - 5 минут
2. Exploratory testing - 60 минут  
3. Automated regression - непрерывно

С 10+ летами опыта я видел: exploratory testing находит баги которые плановое тестирование пропускает. Это инвестиция в качество.