📅 Day 30 – Streams, Exit Codes, and Bash Redirection
🎯 Goal
Develop a deeper operational understanding of how Linux commands communicate:
- success vs failure through exit codes
- output through standard streams
- how redirection controls where output goes
These concepts are foundational for scripting, log analysis, and automation workflows used in security operations.
🛠️ What I Did
Explored exit codes
Linux commands signal success or failure using exit statuses.
Simple test:
```shell
true
echo $?   # 0
false
echo $?   # 1
```
Results:
- `true` → exit code 0
- `false` → exit code 1
Convention:
- 0 = success
- non-zero = failure
Exit codes allow scripts and automation tools to make decisions based on command outcomes.
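As a minimal sketch of that idea, a script can branch directly on a command's status (the file path below is a made-up demo value):

```shell
# Create a small demo file (path is a made-up example)
printf 'apple\norange\n' > /tmp/fruit_demo.txt

# grep exits 0 on a match and 1 on no match, so it can drive if/else
if grep -q orange /tmp/fruit_demo.txt; then
    echo "match found"
else
    echo "no match"
fi

# $? holds the exit code of the most recent command
false
echo "exit code was $?"   # prints: exit code was 1
```

Note that `if` consumes the exit code directly; checking `$?` afterwards is only needed when the code must be inspected later.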
Practiced pipelines
Basic pipeline example:
```shell
grep orange fruit.txt | wc -l
```
Behavior:
- `grep` filters matching lines; its stdout becomes the stdin of `wc -l`
- `wc -l` counts the resulting lines
Important realization:
Pipelines pass stdout only, not error messages.
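A quick way to see this: pipe a failing `ls` into `wc -l` (the path is a made-up example; on Linux, `ls` reports the missing path as a single line on stderr):

```shell
# The error message bypasses the pipe, so wc sees zero lines
# (2>/dev/null just keeps the stray error message off the terminal)
ls /no_such_dir_demo 2>/dev/null | wc -l    # prints: 0

# Merging stderr into stdout *before* the pipe lets wc count the error line
ls /no_such_dir_demo 2>&1 | wc -l
```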
Investigated stdout vs stderr
Commands produce two main output streams:
| Stream | Meaning |
|---|---|
| stdout | normal program output |
| stderr | error messages |
Example:
```shell
ls /bad
```
The error message goes to stderr because the path does not exist.
Practiced output redirection
Redirecting stdout to a file:
```shell
ls > files.txt
```
Redirecting stderr:
```shell
ls /bad 2> errors.txt
```
Capturing both streams in one file:
```shell
ls /bad > output.txt 2>&1
```
Meaning:
- `>` sends stdout to `output.txt`
- `2>&1` duplicates stderr to wherever stdout currently points
Understanding redirection order
One confusing detail was execution order.
Redirections occur left → right.
Example:
Incorrect order:
```shell
ls /bad 2>&1 > file.txt
```
Correct order:
```shell
ls /bad > file.txt 2>&1
```
Why? Descriptors are duplicated at the moment they are evaluated: in the incorrect version, `2>&1` copies stderr to the terminal (where stdout points at that moment), and only afterwards is stdout redirected to `file.txt`, so the errors never reach the file.
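The difference is easy to verify by checking what actually lands in each file (paths are demo values; `ls` of a missing path writes nothing to stdout, only an error to stderr):

```shell
# Wrong order: 2>&1 runs first, pointing stderr at the terminal
# (stdout's target at that moment); the later > only moves stdout
ls /bad_path_demo 2>&1 > /tmp/wrong.txt

# Right order: stdout goes to the file first, then stderr is
# duplicated to point at the same file
ls /bad_path_demo > /tmp/right.txt 2>&1

# wrong.txt is empty (ls produced no stdout); right.txt holds the error
[ -s /tmp/right.txt ] && echo "error captured in right.txt"
[ -s /tmp/wrong.txt ] || echo "wrong.txt is empty"
```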
Mini practical example
Example investigation-style capture:
```shell
grep "error" app.log > results.txt 2>&1
```
This ensures that both matching lines and possible parsing errors are captured in the same output file.
This pattern is useful when preserving evidence during analysis.
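Putting the pieces together, a capture step can also branch on grep's exit code so a failed search is noticed rather than silently ignored (file names below are made-up demo values):

```shell
# Build a small demo log
printf 'startup ok\nerror: parse failed\n' > /tmp/app_demo.log

# Capture matches and any grep errors in one evidence file,
# then branch on the exit code (0 = at least one match)
if grep "error" /tmp/app_demo.log > /tmp/results_demo.txt 2>&1; then
    echo "matches captured"
else
    echo "no matches, or grep itself failed"
fi
cat /tmp/results_demo.txt   # prints: error: parse failed
```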
🔐 Key Cybersecurity Connections
Understanding command streams matters for:
- log parsing pipelines
- automated detection scripts
- forensic evidence collection
- error handling in analysis workflows
For example:
If a parsing tool writes errors to stderr and they are not captured, an analyst may falsely assume the command succeeded.
Capturing both streams prevents silent failures during investigation.
⚠️ Challenges
The main conceptual challenge was understanding how file descriptor duplication works.
At first it seemed like redirections happened simultaneously.
In reality, the shell rewires descriptors sequentially.
Once this model was understood, redirection behavior became predictable.
🧠 What I Learned
- Linux commands communicate through structured output streams.
- Pipelines pass stdout but ignore stderr unless redirected.
- Exit codes allow reliable automation decisions.
- Redirection order affects where data actually goes.
⏭️ Next Steps
- Combine `grep`, `find`, and pipelines for investigation workflows.
- Practice capturing outputs during log analysis.
- Begin integrating these concepts into small Bash scripts.
💭 Reflection
This session shifted my perspective from simply running commands to understanding how commands communicate and interact.
Once streams and exit codes are understood, the shell becomes a much more powerful environment for automation and analysis.
🧩 Lessons Learned
What worked
- experimenting interactively with commands
- intentionally triggering errors to observe behavior
What broke
- misunderstanding the order of redirection operations
Why it broke
- assuming redirections happen simultaneously
Fix / takeaway
Always remember that Bash evaluates redirections from left to right.
