🟢 Node.js Fundamentals
1. what node.js actually is
node.js is a javascript runtime built on chrome's V8 engine. it allows you to run javascript outside the browser — on servers, scripts, CLI tools, etc.
it is not a framework. it is not a language. it is a runtime environment.
2. why node.js was created
before node.js (2009), server-side was dominated by PHP, Java, Ruby, Python. ryan dahl created node.js to solve problems in traditional web servers:
the problem with traditional servers (thread-per-request):
request 1 → thread 1 → waits for DB → responds
request 2 → thread 2 → waits for DB → responds
...
request 10,000 → ❌ no threads left → connection refused
each request spawned a new thread. threads are expensive (~2MB each), and most of the time they're just sitting idle, waiting for I/O.
what node.js does differently:
request 1 → start DB query → move on
request 2 → start DB query → move on
request 10,000 → start API call → move on
DB query 1 done → callback runs → respond to request 1
one thread handles everything — it never waits. it delegates I/O to the OS and moves to the next task.
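the handoff above can be sketched in a few lines — all the synchronous "start work" code finishes first, and the queued completions run afterwards on the event loop (a toy sketch, using already-resolved promises to stand in for finished DB queries):

```javascript
const order = [];

order.push("request 1: query handed off");
// a resolved promise stands in for a DB query completing later
Promise.resolve().then(() => order.push("respond to request 1"));

order.push("request 2: query handed off");
Promise.resolve().then(() => order.push("respond to request 2"));

// both requests are "started" before either response runs:
setTimeout(() => console.log(order), 0);
// [ 'request 1: query handed off', 'request 2: query handed off',
//   'respond to request 1', 'respond to request 2' ]
```

the key observation: the main thread never sat waiting between request 1 and request 2.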
3. what problem node.js solves
node.js is designed for I/O-bound, event-driven applications:
- API servers — thousands of requests that mostly wait for databases
- real-time apps — chat, live dashboards (websockets)
- microservices — lightweight services that talk to each other
- integration layers — unified APIs aggregating multiple sources
- streaming — processing data in chunks
what node.js is NOT good for:
CPU-intensive tasks — video encoding, heavy computation. while the CPU is busy, the event loop is blocked and every other request waits. (worker_threads can offload CPU work to separate threads, but it isn't node's sweet spot.)
insight: when someone says "node doesn't scale" — they usually mean CPU-bound work. for I/O-bound work (90% of web apps), node scales extremely well.
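if you do need CPU work inside a node process, worker_threads moves it off the main thread. a minimal sketch — the inlined fibonacci worker is purely illustrative:

```javascript
import { Worker } from "node:worker_threads";

// run a CPU-heavy function on a separate thread so the event loop stays free
function fibInWorker(n) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(
      `const { parentPort, workerData } = require("node:worker_threads");
       const fib = (n) => (n < 2 ? n : fib(n - 1) + fib(n - 2));
       parentPort.postMessage(fib(workerData));`,
      { eval: true, workerData: n } // eval: run the source string directly
    );
    worker.on("message", resolve);
    worker.on("error", reject);
  });
}

const result = await fibInWorker(20); // main thread keeps serving requests meanwhile
console.log(result); // 6765
```

in practice you'd put the worker code in its own file and often use a pool rather than spawning a worker per task.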
4. core runtime concepts
key characteristics:
- single-threaded — one main thread runs your javascript (under the hood, libuv keeps a small thread pool for some I/O, but your code never sees it)
- event-driven — uses an event loop to handle async operations
- non-blocking I/O — file reads, network calls don't block the main thread
what blocks the event loop (avoid):
import fs from "fs";
// ❌ synchronous file read in a server
const data = fs.readFileSync("huge-file.json"); // blocks EVERYTHING
// ❌ heavy computation
for (let i = 0; i < 1_000_000_000; i++) {} // blocks EVERYTHING
// ✅ use the async alternative instead
const data = await fs.promises.readFile("huge-file.json");
5. streams (senior-level)
streams let you process data piece by piece instead of loading everything into memory.
// ❌ loading a 2GB file into memory
const data = await fs.promises.readFile("huge-log.txt", "utf-8");
// ✅ streaming — processes line by line
import { createReadStream } from "fs";
import { createInterface } from "readline";
const stream = createReadStream("huge-log.txt");
const rl = createInterface({ input: stream });
rl.on("line", (line) => {
  if (line.includes("ERROR")) console.log(line);
});
piping streams:
import { createReadStream, createWriteStream } from "fs";
import { createGzip } from "zlib";
createReadStream("input.txt")
  .pipe(createGzip())
  .pipe(createWriteStream("input.txt.gz"));
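in modern code, pipeline() from stream/promises is usually preferred over chained .pipe() — it propagates an error from any stage and destroys all the streams on failure, which bare .pipe() does not. the same gzip example (writing a small sample file first so the sketch is runnable end to end):

```javascript
import { pipeline } from "node:stream/promises";
import { createReadStream, createWriteStream } from "node:fs";
import { createGzip } from "node:zlib";
import { writeFile } from "node:fs/promises";

await writeFile("input.txt", "log line\n".repeat(1000)); // sample data

// if any stage fails, the promise rejects and all three streams are cleaned up
await pipeline(
  createReadStream("input.txt"),
  createGzip(),
  createWriteStream("input.txt.gz")
);
```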
6. file system (fs)
import { readFile, writeFile } from "fs/promises";
// async with promises (modern, preferred)
const data = await readFile("config.json", "utf-8");
// writing
await writeFile("output.json", JSON.stringify({ name: "olga" }), "utf-8");
7. working with APIs
POST request with auth
const response = await fetch("https://api.example.com/users", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${token}`,
  },
  body: JSON.stringify({ name: "olga", email: "olga@test.com" }),
});
if (!response.ok) {
  throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
const data = await response.json();
common auth patterns
// bearer token
headers: { "Authorization": `Bearer ${token}` }
// API key
headers: { "X-API-Key": apiKey }
// basic auth
headers: {
  "Authorization": `Basic ${Buffer.from(`${user}:${pass}`).toString("base64")}`
}
8. express basics
routes + middleware
import express from "express";
const app = express();
app.use(express.json());
app.get("/api/users", (req, res) => {
  res.json({ users: [] });
});
app.post("/api/users", (req, res) => {
  const { name, email } = req.body;
  res.status(201).json({ id: "123", name, email });
});
// auth middleware (verifyToken is your own token-checking function)
function authenticate(req, res, next) {
  const token = req.headers.authorization?.split(" ")[1];
  if (!token) return res.status(401).json({ error: "no token" });
  req.user = verifyToken(token);
  next();
}
// middleware does nothing until you attach it to a route:
app.get("/api/profile", authenticate, (req, res) => res.json(req.user));
// error handling (must have 4 params)
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).json({ error: "something went wrong" });
});
9. environment variables
const apiKey = process.env.API_KEY;
const port = process.env.PORT || 3000;
# .env
API_KEY=sk-1234567890
DATABASE_URL=postgresql://user:pass@localhost:5432/mydb
- never commit .env files — add to .gitignore
- never log secrets
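a small fail-fast helper makes missing configuration crash at startup instead of failing mysteriously later (requireEnv is my own name; note also that node 20.6+ can load .env files natively via node --env-file=.env, without the dotenv package):

```javascript
// throws at startup if a required variable is missing
function requireEnv(name) {
  const value = process.env[name];
  if (!value) throw new Error(`missing required env var: ${name}`);
  return value;
}

process.env.API_KEY = "sk-test"; // set inline here only so the sketch runs
const apiKey = requireEnv("API_KEY");
console.log(apiKey); // sk-test
```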
10. debugging
reading stack traces — the top frame is where the error was thrown; the frames below trace the call path that led there
TypeError: Cannot read properties of undefined (reading 'name')
    at getUser (/app/src/users.js:15:23)
    at processRequest (/app/src/handler.js:42:10)
    at /app/src/server.js:8:5
effective logging
// ❌ useless
console.log(data);
// ✅ useful
console.log("[getUser] response:", JSON.stringify(data, null, 2));
11. error handling patterns (senior-level)
custom error classes
class AppError extends Error {
  constructor(message, statusCode, code) {
    super(message);
    this.name = this.constructor.name; // so logs show "NotFoundError", not "Error"
    this.statusCode = statusCode;
    this.code = code;
  }
}
class NotFoundError extends AppError {
  constructor(resource) {
    super(`${resource} not found`, 404, "NOT_FOUND");
  }
}
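tying this back to the express error middleware: branching on instanceof lets known errors produce structured responses while unknown ones stay a generic 500 (classes minimally redefined here so the sketch runs standalone; toResponse is my own helper name):

```javascript
class AppError extends Error {
  constructor(message, statusCode, code) {
    super(message);
    this.statusCode = statusCode;
    this.code = code;
  }
}
class NotFoundError extends AppError {
  constructor(resource) {
    super(`${resource} not found`, 404, "NOT_FOUND");
  }
}

// known errors → structured response; anything else → opaque 500
function toResponse(err) {
  if (err instanceof AppError) {
    return { status: err.statusCode, body: { error: err.message, code: err.code } };
  }
  return { status: 500, body: { error: "something went wrong" } };
}

console.log(toResponse(new NotFoundError("user")));
// { status: 404, body: { error: 'user not found', code: 'NOT_FOUND' } }
```

never echoing raw internal error messages to clients (the 500 branch) is the point — AppError instances are the ones you wrote deliberately, so their messages are safe to expose.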
graceful shutdown
process.on("SIGTERM", () => {
  console.log("shutting down gracefully...");
  // server.close() only stops NEW connections — wait for its callback
  // so in-flight requests finish before tearing everything down
  server.close(async () => {
    await db.disconnect();
    process.exit(0);
  });
});
12. when to use node.js
- ✅ REST/GraphQL APIs
- ✅ real-time apps (websockets)
- ✅ microservices and integration layers
- ✅ serverless functions
- ✅ CLI tools
- ❌ heavy computation (use Go, Rust)
- ❌ memory-constrained environments