🛢 Databases & ORM Integration

Integrate databases with Node.js using raw drivers and ORMs.

🧪 Connecting to MongoDB & Mongoose

In this chapter, we'll explore how to connect your Node.js application to MongoDB using Mongoose, the most popular ODM (Object-Document Mapping) library for MongoDB in Node.js. We'll cover everything from setting up the connection to performing CRUD operations and managing relationships between documents.

💡 Why Use Mongoose?

  • Provides a clear structure for your MongoDB data with schemas
  • Includes built-in validation mechanisms
  • Simplifies relationships between documents
  • Adds query middleware and hooks for additional functionality

Setting Up the Connection

const mongoose = require('mongoose');

// Connect to MongoDB
async function connectDB() {
  try {
    // useNewUrlParser and useUnifiedTopology are the default behavior
    // since Mongoose 6 and no longer need to be passed
    await mongoose.connect('mongodb://localhost:27017/myapp');
    console.log('Connected to MongoDB!');
  } catch (error) {
    console.error('Connection failed:', error);
  }
}

connectDB();

Replace localhost:27017/myapp with your actual MongoDB connection string. If you're using a cloud service like MongoDB Atlas, use the appropriate connection URI.
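In practice, the URI should not be hard-coded. A common pattern is to read it from an environment variable with a local fallback for development; the variable name `MONGODB_URI` below is an assumed convention, not something Mongoose requires:

```javascript
// Read the connection string from the environment so credentials stay out
// of source control; fall back to a local instance for development.
function getMongoUri() {
  return process.env.MONGODB_URI || 'mongodb://localhost:27017/myapp';
}
```

`connectDB()` can then call `mongoose.connect(getMongoUri())` instead of embedding the string.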

Creating Schemas and Models

Mongoose schemas define the structure of your data. Here's an example of a user schema:

const userSchema = new mongoose.Schema({
  name: {
    type: String,
    required: true
  },
  email: {
    type: String,
    required: true,
    unique: true
  },
  password: {
    type: String,
    required: true,
    select: false // Excluded from query results unless explicitly selected
  },
  createdAt: {
    type: Date,
    default: Date.now
  }
});

const User = mongoose.model('User', userSchema);

Basic CRUD Operations

Let's look at how to perform basic CRUD operations using Mongoose:

// Create a new user
const createUser = async () => {
  const user = new User({
    name: 'John Doe',
    email: 'john@example.com',
    password: 'securepassword123'
  });

  try {
    await user.save();
    console.log('User created successfully!');
  } catch (error) {
    console.error('Error creating user:', error);
  }
};

// Find all users
const findAllUsers = async () => {
  try {
    const users = await User.find().select('-password');
    console.log(users);
  } catch (error) {
    console.error('Error fetching users:', error);
  }
};

// Update a user
const updateUser = async () => {
  try {
    const updatedUser = await User.findByIdAndUpdate(
      'user-id-here',
      { name: 'John Updated' },
      { new: true }
    );
    console.log(updatedUser);
  } catch (error) {
    console.error('Error updating user:', error);
  }
};

// Delete a user
const deleteUser = async () => {
  try {
    await User.findByIdAndDelete('user-id-here');
    console.log('User deleted successfully!');
  } catch (error) {
    console.error('Error deleting user:', error);
  }
};

Handling Relationships

Mongoose supports relationships between documents through references. Here's an example of a post schema with a user reference:

const postSchema = new mongoose.Schema({
  title: {
    type: String,
    required: true
  },
  content: {
    type: String,
    required: true
  },
  author: {
    type: mongoose.Schema.Types.ObjectId,
    ref: 'User'
  },
  comments: [
    {
      type: mongoose.Schema.Types.ObjectId,
      ref: 'Comment'
    }
  ]
});

To populate these references, use the populate() method:

const getPostWithAuthor = async () => {
  try {
    const post = await Post.findById('post-id-here')
      .populate('author');
    console.log(post);
  } catch (error) {
    console.error('Error fetching post:', error);
  }
};

💡 Validation and Indexing

Mongoose provides built-in validation through schema definitions. Here are some common validators:

  • required: true - Field must be provided
  • unique: true - Ensures field values are unique across the collection
  • minlength: number and maxlength: number - Enforces string length constraints
  • enum: array - Restricts values to a specified list

// Example schema with validation
const blogSchema = new mongoose.Schema({
  title: {
    type: String,
    required: true,
    minlength: 5,
    maxlength: 100
  },
  status: {
    type: String,
    enum: ['DRAFT', 'PUBLISHED'],
    default: 'DRAFT'
  }
});
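Beyond the built-in validators, Mongoose accepts custom validator functions on a path. The sketch below uses a deliberately simple email check (the regex is an illustration, not full RFC 5322 validation):

```javascript
// A minimal email check used as a custom validator.
function isEmail(value) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

// The shape Mongoose expects for a path with a custom validator:
const emailPath = {
  type: String,
  required: true,
  validate: {
    validator: isEmail,
    message: (props) => `${props.value} is not a valid email address`
  }
};
```

For the indexing half of this section, Mongoose also lets you declare indexes directly on a schema, e.g. `blogSchema.index({ title: 1 })`.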

Error Handling

Always handle potential errors when working with databases. Here's an example of error handling in a middleware:

// Middleware for error handling
function errorHandler(err, req, res, next) {
  console.error('Error:', err);
  
  if (err.name === 'ValidationError') {
    return res.status(400).send({
      success: false,
      errors: Object.keys(err.errors).map(key => ({
        field: key,
        message: err.errors[key].message
      }))
    });
  }

  if (err.name === 'CastError') {
    return res.status(400).send({
      success: false,
      message: 'Invalid ID format'
    });
  }

  res.status(500).send({
    success: false,
    message: 'Internal server error'
  });
}
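The ValidationError shaping above can be pulled out into a pure helper so it can be unit-tested without a running server; this extraction is a refactoring suggestion, not part of the original middleware:

```javascript
// Shape a Mongoose-style ValidationError into the response body used
// by the error-handling middleware.
function formatValidationError(err) {
  return {
    success: false,
    errors: Object.keys(err.errors).map((key) => ({
      field: key,
      message: err.errors[key].message
    }))
  };
}
```

In an Express app the middleware itself is registered after all routes with `app.use(errorHandler)`, since Express identifies error handlers by their four-argument signature.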

💡 Best Practices for Working with Mongoose

  • Always use async/await for database operations to make your code more readable
  • Use indexes wisely to improve query performance
  • Avoid circular references in relationships
  • Keep your schemas clean and modular
  • Test your validation rules thoroughly

💡 Summary

In this chapter, we've covered the fundamentals of connecting to MongoDB using Mongoose in a Node.js application. You now know how to define schemas, perform CRUD operations, manage relationships between documents, and handle errors effectively.

🧮 PostgreSQL with pg and Prisma

Welcome to PostgreSQL with `pg` and Prisma! In this chapter, you'll learn how to work with one of the most popular relational databases using two powerful tools: the `pg` driver and the Prisma ORM. By the end of this section, you'll be able to confidently use both libraries to build robust applications.

💡 Why PostgreSQL?

  • Scalability: Handles large datasets and high traffic with ease.
  • ACID Compliance: Ensures data consistency and reliability.
  • Rich Feature Set: Extensions, triggers, and advanced querying capabilities.

Getting Started with `pg`

The `pg` driver provides a low-level interface for connecting to PostgreSQL. Here's how to get started:

const { Pool } = require('pg');

const pool = new Pool({
  user: 'postgres',
  host: 'localhost',
  database: 'mydb',
  password: 'password',
  port: 5432
});

pool.query('SELECT * FROM users', (err, res) => {
  if (err) throw err;
  console.log(res.rows);
});

💡 Key Points About `pg`

  • Use connection pools for better performance.
  • Always check for and handle query errors.
  • Use parameterized queries rather than concatenating user input into SQL strings.
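A parameterized query keeps user input out of the SQL text entirely; `pg` sends the values to the server separately from the statement. The helper below returns the `{ text, values }` query config object that `pool.query()` accepts:

```javascript
// Build a pg query config: $1 is a placeholder, and the value travels
// out-of-band, so the input can never alter the SQL itself.
function findUserByEmailQuery(email) {
  return {
    text: 'SELECT id, email FROM users WHERE email = $1',
    values: [email]
  };
}
// Usage: pool.query(findUserByEmailQuery('alice@example.com'), callback)
```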

Introduction to Prisma ORM

Prisma is a modern ORM that simplifies database operations. It provides a type-safe and developer-friendly API.

// schema.prisma

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model User {
  id       Int     @id @default(autoincrement())
  email    String  @unique
  password String
}

Using the Prisma client:

import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

async function createUser(email, password) {
  const user = await prisma.user.create({
    data: {
      email,
      password
    }
  });
  return user;
}

💡 Migrations with Prisma & `pg`

Migrations are essential for maintaining consistent database schemas across environments.

// Using pg

const { Client } = require('pg');

async function migrate() {
  const client = new Client({
    user: 'postgres',
    host: 'localhost',
    database: 'mydb',
    password: 'password',
    port: 5432
  });
  try {
    await client.connect();
    await client.query(
      'CREATE TABLE IF NOT EXISTS users (id SERIAL PRIMARY KEY, email VARCHAR(255) UNIQUE)'
    );
    console.log('Table created');
  } catch (err) {
    console.error(err);
  } finally {
    await client.end();
  }
}

// Using Prisma (run in your terminal)
npx prisma migrate dev --name initial_migration

Data Validation & Constraints

Validating data is crucial for maintaining data integrity. Use PostgreSQL constraints and Prisma's validation features.

-- Database constraints (raw SQL via pg)
ALTER TABLE users ADD CONSTRAINT unique_email UNIQUE (email);

// Validation with Prisma: schema constraints like @unique are enforced by
// the database, and prisma.user.create() throws if one is violated
try {
  const user = await prisma.user.create({
    data: { email, password }
  });
} catch (err) {
  // err.code === 'P2002' signals a unique-constraint violation
  console.error('Could not create user:', err.message);
}

💡 Best Practices for PostgreSQL Development

  • Use indexing wisely to optimize queries.
  • Always use parameterized queries to prevent SQL injection.
  • Monitor database performance with tools like pgAdmin or the pg_stat_statements extension.

Real-World Application Integration

Combining `pg` and Prisma can give you the best of both worlds. Use `pg` for raw database operations and Prisma for higher-level abstractions.

// Example integration (ESM imports for both libraries)
import pg from 'pg';
import { PrismaClient } from '@prisma/client';

const pool = new pg.Pool({
  user: 'postgres',
  database: 'myapp',
  // ... other config
});

const prisma = new PrismaClient();

// Prisma for the typed, high-level lookup
async function getUserByEmail(email) {
  try {
    const result = await prisma.user.findUnique({
      where: { email },
      select: { id: true, email: true }
    });
    return result;
  } catch (err) {
    console.error('Error fetching user:', err);
  }
}

// pg for raw SQL that the ORM doesn't cover as naturally
async function countUsers() {
  const res = await pool.query('SELECT COUNT(*) AS total FROM users');
  return Number(res.rows[0].total);
}

Common Mistakes to Avoid

  • Don't use raw SQL queries without proper sanitization.
  • Avoid overusing database connections - always pool them.
  • Don't ignore database performance metrics.

🔁 Transactions & Performance Tips

Welcome to our chapter on Transactions & Performance Tips! In this section, we'll explore how to optimize your Node.js applications when working with databases. You'll learn about connection pooling, indexing, and best practices for using transactions effectively.

💡 Understanding Database Connections

Database connections are critical resources in any Node.js application. Understanding how to manage them efficiently can have a significant impact on your application's performance.

💡 Connection Pooling Best Practices

  • Always use connection pooling to manage database connections.
  • Configure your pool size based on your application's load and requirements.
  • Avoid creating new connections for each request - reuse pooled connections.
  • Implement proper connection cleanup to prevent resource leaks.

const mysql = require('mysql2');

const pool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  password: 'password',
  database: 'testdb',
  connectionLimit: 10
});

💡 The Power of Indexing

Proper indexing is essential for optimizing database queries. Without indexes, your database engine may perform full table scans, leading to poor performance.

  • Index frequently queried fields to improve lookup times.
  • Avoid over-indexing - too many indexes can degrade write performance.
  • Monitor query execution plans to identify missing or unused indexes.
  • Use composite indexes when querying on multiple columns.

CREATE INDEX idx_user_name ON users(name);

💡 Mastering Transactions

Transactions allow you to execute multiple database operations as a single unit of work. They provide atomicity, consistency, isolation, and durability (ACID properties).

  • Use transactions when performing operations that must all succeed or fail together.
  • Set appropriate isolation levels to prevent common issues like dirty reads.
  • Always have a rollback strategy in case of errors or exceptions.
  • Keep transactions short to avoid locks and improve concurrency.

pool.getConnection((err, connection) => {
  if (err) throw err;

  connection.beginTransaction((err) => {
    if (err) { connection.release(); throw err; }

    // Execute transaction operations here, then commit
    connection.commit((err) => {
      if (err) {
        // Roll back on failure so partial work is discarded
        return connection.rollback(() => connection.release());
      }
      connection.release();
    });
  });
});
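The callback pattern above can be wrapped in a reusable promise-based helper. The sketch assumes a connection object exposing a promise-returning query() and a release() method, as mysql2/promise connections do:

```javascript
// Run `work` inside a transaction: commit on success, roll back on any
// error, and always release the connection back to the pool.
async function withTransaction(conn, work) {
  await conn.query('START TRANSACTION');
  try {
    const result = await work(conn);
    await conn.query('COMMIT');
    return result;
  } catch (err) {
    await conn.query('ROLLBACK');
    throw err; // surface the failure to the caller after rolling back
  } finally {
    conn.release();
  }
}
```

Keeping the begin/commit/rollback bookkeeping in one place makes it much harder to forget a rollback or leak a connection.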

💡 Performance Optimization Tips

  • Use query caching for frequently executed queries.
  • Optimize your database schema - use appropriate data types and normalize where needed.
  • Minimize the amount of data transferred over the network.
  • Monitor and tune your database regularly.
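Query caching from the first bullet can be as simple as an in-memory map keyed by the query text, with a TTL so stale results expire. This is a toy illustration of the idea, not a substitute for Redis or the database's own caching:

```javascript
// Minimal TTL cache for query results, keyed by the SQL text.
class QueryCache {
  constructor(ttlMs = 5000) {
    this.ttlMs = ttlMs;
    this.entries = new Map();
  }
  get(sql) {
    const entry = this.entries.get(sql);
    // Treat missing or expired entries as cache misses
    if (!entry || Date.now() - entry.at > this.ttlMs) return undefined;
    return entry.rows;
  }
  set(sql, rows) {
    this.entries.set(sql, { rows, at: Date.now() });
  }
}
```

On a miss, the caller runs the real query and stores the rows with set(); cache only queries whose results can safely be a few seconds stale.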

Common Pitfalls to Avoid

  • Don't interpolate user input into SQL strings - use parameterized queries to prevent SQL injection.
  • Avoid long-running transactions that can block other operations.
  • Don't ignore connection pool configuration - it's crucial for performance.
  • Don't forget to release connections back to the pool after use.

💡 Summary

In this chapter, we've covered essential topics for optimizing your Node.js database applications. By properly managing connections, implementing effective indexing strategies, and using transactions wisely, you can significantly improve your application's performance and reliability.

Quiz

Question 1 of 14

Which method is used to connect to MongoDB in Mongoose?

  • connect()
  • createConnection()
  • setupDB()
  • mongoose.connect()