🎯 Goal

  • Get something real and tangible shipped today (even if I had to stop abruptly).
  • Build a clean detection-engineering repo (detections) with SOC-style structure and drafts.
  • Fix my blog setup issues (icons, Git repo hygiene).
  • Start actually understanding what I’m building (not just “copy-pasting”).

✅ What I Did

1) Fixed my blog repo layout + icons (Jekyll)

  • Moved my blog repo JuriBuora.github.io into ~/Projects/ (kept Git history intact).
  • Ran Jekyll locally and saw errors for missing:
    • favicon.ico
    • apple-touch-icon.png
    • apple-touch-icon-precomposed.png
  • Tried installing Pillow with pip and hit the macOS / Homebrew “externally-managed environment” block (PEP 668).
  • Solved it properly by using a local virtual environment:
    • python3 -m venv .venv
    • source .venv/bin/activate
    • pip install pillow
  • Generated a real favicon.ico (16 + 32) from a PNG using Pillow.
  • Added/committed the icon files and fixed Git push issues.
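
The venv fix plus icon generation above can be sketched end to end (a minimal sketch; `source.png` is a placeholder name for the actual icon artwork):

```shell
# Create an isolated environment so pip doesn't touch system Python (PEP 668)
python3 -m venv .venv
source .venv/bin/activate

# Pillow installs fine inside the venv
pip install pillow

# Generate a multi-size favicon.ico from a source PNG
# (source.png is a placeholder for the real artwork file)
python - <<'EOF'
from PIL import Image

img = Image.open("source.png")
# .ico supports multiple embedded sizes; 16x16 + 32x32 cover most browsers
img.save("favicon.ico", sizes=[(16, 16), (32, 32)])
EOF
```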

2) Solved Git issues cleanly (no panic mode)

  • Hit: push rejected (fetch first) because remote had commits not present locally.
  • Fixed via git fetch, then git pull --rebase, then push.
  • Cleaned up repo hygiene:
    • removed temporary working folders (_iconwork/)
    • ensured .venv/ and _iconwork/ are ignored
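
The rebase workflow in full (assuming the branch is main and the remote is origin):

```shell
# Remote has commits I don't have locally, so a plain push is rejected
git fetch origin                  # see what the remote actually has
git pull --rebase origin main     # replay my local commits on top of the remote
git push origin main              # now the push is a clean fast-forward
```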

3) Built my Detection Engineering repo (detections)

  • Created the repo and cloned it locally under ~/Projects/.
  • Created a SOC-style structure:
    • rules/kql/ for KQL detections
    • rules/sigma/ for Sigma (later)
    • docs/ for human-readable writeups
    • tests/ scaffolding (future)
  • Created a reusable template: docs/detection_template.md.
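
The scaffolding itself is a one-liner (with `.gitkeep` placeholders so Git tracks the still-empty directories):

```shell
# SOC-style layout for the detections repo
mkdir -p rules/kql rules/sigma docs tests

# .gitkeep placeholders so Git tracks the empty directories
touch rules/kql/.gitkeep rules/sigma/.gitkeep docs/.gitkeep tests/.gitkeep
```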

4) Removed macOS junk from Git

  • Discovered .DS_Store had made it into the repo (trash).
  • Removed tracked .DS_Store via git rm --cached and added proper ignores.
  • Ensured the repo returns clean on git status.
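
The cleanup boils down to untracking the files (without deleting them from disk) and then ignoring them permanently:

```shell
# Stop tracking .DS_Store everywhere in the repo; files stay on disk
find . -name .DS_Store -not -path './.git/*' -print0 \
  | xargs -0 git rm --cached --ignore-unmatch

# Ignore it permanently
echo ".DS_Store" >> .gitignore
git add .gitignore
git commit -m "chore: remove and ignore .DS_Store"
```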

5) Drafted 10 detections (t0001–t0010)

Created draft detection files with consistent naming:

  • t0001__ssh-bruteforce
  • t0002__rdp-bruteforce
  • t0003__password-spray
  • t0004__suspicious-powershell
  • t0005__new-local-admin
  • t0006__persistence-service-or-task
  • t0007__impossible-travel-signin
  • t0008__mail-forwarding-rule
  • t0009__office-spawns-shell
  • t0010__possible-exfil-large-outbound

Each detection has:

  • a KQL rule file in rules/kql/
  • a matching operational writeup in docs/ (goal, data sources, false positives, tuning, triage, MITRE mapping)
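
Stamping out the paired skeletons can be scripted from the ID/slug list (a sketch, not necessarily how I did it by hand; only the first two detections are shown):

```shell
# Create paired KQL + doc drafts for each detection, following the
# tNNNN__slug naming convention; docs start from the reusable template
while read -r id slug; do
  name="${id}__${slug}"
  : > "rules/kql/${name}.kql"                        # empty KQL draft
  cp docs/detection_template.md "docs/${name}.md"    # writeup from template
done <<'EOF'
t0001 ssh-bruteforce
t0002 rdp-bruteforce
EOF
```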

6) Actually studied + understood File 1 and File 2

Instead of just shipping files, I switched into “learn what this means” mode:

  • File 1 (SSH brute force):
    • learned what syslog is, what connectors are, why time-binning matters
    • improved the detection by extracting usernames properly
    • created an Obsidian-ready summary note
  • File 2 (RDP brute force):
    • learned Windows basics: EventID 4625 (failed logon), LogonType 10 (RemoteInteractive)
    • learned why external vs internal IP changes triage severity
    • created an Obsidian-ready summary note
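
The "extract usernames properly" step from File 1 is easiest to see on a raw sshd line (the log line below is a made-up sample in the standard auth.log format):

```shell
# A typical sshd failure line as it lands in /var/log/auth.log
line='Jan 10 12:34:56 host sshd[1234]: Failed password for invalid user admin from 203.0.113.7 port 51234 ssh2'

# Pull the username out with a regex — same idea as KQL's extract()
user=$(printf '%s\n' "$line" \
  | sed -E 's/.*Failed password for (invalid user )?([^ ]+) from.*/\2/')
echo "$user"    # admin
```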

7) Built quick cheat sheets for fast learning

  • Linux filesystem quick map + flashcards:
    • /etc, /var/log, /tmp, /proc, etc.
  • Windows SOC L1 cheat sheets:
    • Top event IDs (4624/4625/4688/4720/4732/1102/etc.)
    • Logon types (2, 3, 10, etc.)
  • Clarified what SMB is and why SMB access shows up as network logons (LogonType 3).

🔗 Key Cybersecurity Connections

  • Detection engineering isn’t just “write a query”
    • You need data sources, thresholds, false positives, tuning, and triage.
  • Time windows are core
    • bin(TimeGenerated, 5m) time-bucketing is the backbone of brute-force style detections.
  • Correlation increases confidence
    • failures → then a success (possible brute force succeeded)
  • Linux vs Windows logging
    • Linux evidence often originates in /var/log/auth.log and ships via syslog/collectors to a SIEM
    • Windows evidence uses stable event IDs (4625, 4624) and logon types (10 for RDP)
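
The bin(TimeGenerated, 5m) idea is just integer math — snap each timestamp to the start of its 5-minute window, then count per bucket. A shell/awk analog (with made-up epoch timestamps) shows the mechanics:

```shell
# Each input line: <epoch_seconds> <source_ip>
# bucket = epoch - (epoch % 300), the start of its 5-minute window;
# count failures per (bucket, ip) like KQL's summarize count() by bin(...)
awk '{ bucket = $1 - ($1 % 300); count[bucket " " $2]++ }
     END { for (k in count) print k, count[k] }' <<'EOF'
1700000000 203.0.113.7
1700000100 203.0.113.7
1700000200 203.0.113.7
1700000400 203.0.113.7
EOF
```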

⚠️ Challenges

  • macOS Python packaging restrictions (PEP 668) blocked pip install.
    • Fixed by using a local venv instead of breaking system Python.
  • Git push rejection due to remote being ahead.
    • Solved cleanly with rebase workflow.
  • .DS_Store infected the repo.
    • Removed and prevented permanently with .gitignore.
  • Realized I could ship outputs faster than I could explain them.
    • So I started converting output into understanding + notes.

🧠 What I Learned

  • A daemon is a background service (e.g., sshd) that runs continuously.
  • In KQL:
    • | is a pipeline (filtering/transform chain)
    • bin() creates time buckets
    • extract() uses regex (regular expression) to pull structured fields from log text
  • Good detections have two layers:
    • rules/ = executable detection logic
    • docs/ = operational understanding (triage, tuning, escalation)

⏭️ Next Steps

  • Finish learning the remaining detections one by one (File 3 → File 10) using a repeatable workflow:
    1) quick test questions
    2) line-by-line explanation
    3) Obsidian-ready summary file
  • Improve drafts with better field extraction and realistic tables depending on SIEM sources.
  • Add a small “SOC case study” wrapper showing detection → triage → tuning iteration.

💭 Reflection

The repo is now tangible, structured, and versioned — but the real win was switching to:

“Explain it like I’m going to be interviewed on it.”


✅ Lessons Learned

What worked

  • Building a real repo with real artifacts created momentum.
  • Fixing errors properly (venv, rebase) prevented future chaos.
  • Testing myself with questions forced real comprehension.

What broke

  • .DS_Store clutter and push rejection slowed me down.
  • Knowledge gaps made me feel like I was “doing without knowing.”

Why it broke

  • macOS tooling defaults + Homebrew protections
  • normal Git workflow mismatch (remote ahead)
  • moving faster than understanding

Fix / takeaway

  • Keep shipping, but always convert shipping into knowledge: build → test → explain → document.