πŸ“˜ Terraform Series – Day 12


Gujjar Apurv is a passionate DevOps Engineer in the making, dedicated to automating infrastructure, streamlining software delivery, and building scalable cloud-native systems. With hands-on experience in tools like AWS, Docker, Kubernetes, Jenkins, Git, and Linux, he thrives at the intersection of development and operations. Driven by curiosity and continuous learning, Apurv shares insights, tutorials, and real-world solutions from his journey, making complex tech simple and accessible. Whether it's writing YAML, scripting in Python, or deploying on the cloud, he believes in doing it the right way. "Infrastructure is code, but reliability is art."

Secure State Management (S3 + DynamoDB Locking)

πŸ“ Abstract

In Terraform, the state file (terraform.tfstate) is the most critical component that connects your configuration with real infrastructure. However, storing it locally can lead to security risks, data loss, and team conflicts.

This blog explains how to securely manage Terraform state using AWS S3 (remote storage) and DynamoDB (state locking), which is the industry-standard approach for production environments.

🎯 Objectives

After completing this blog, you will be able to:

  • Understand Terraform state and its importance

  • Know why .tfstate should never be pushed to GitHub

  • Handle state loss scenarios

  • Understand state conflicts in team environments

  • Implement remote backend using S3 + DynamoDB

  • Test state locking in real scenarios

πŸ”· Step 1: What is Terraform State?

Terraform maintains a file:

terraform.tfstate

🧠 This file stores:

  • Real infrastructure details

  • Resource IDs and attributes

  • Mapping between Terraform code ↔ AWS resources

πŸ”· Step 2: Should You Push .tfstate to GitHub?

πŸ‘‰ ❌ NO. Never do this.

⚠️ Why?

Because it contains:

  • Secrets (API keys, credentials)

  • Resource IDs

  • Internal infrastructure data

πŸ‘‰ This can lead to security breaches

βœ… Add to .gitignore

*.tfstate
*.tfstate.backup
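
If the state file was committed before the .gitignore entry existed, git keeps tracking it. The fix is git rm --cached, which untracks the file without deleting your local copy. Here is a self-contained simulation in a throwaway repo (file contents and commit messages are hypothetical):

```shell
# Simulated repo where the state file was committed by mistake
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo
echo '{}' > terraform.tfstate
git add terraform.tfstate
git commit -qm "accidentally committed state"

# Fix: ignore state files and stop tracking the committed copy
printf '*.tfstate\n*.tfstate.backup\n' > .gitignore
git rm --cached -q terraform.tfstate
git add .gitignore
git commit -qm "remove state file from version control"

git ls-files   # .gitignore is tracked; terraform.tfstate no longer is
```

Note that the file already committed remains in git history; if it contained real secrets, rotate them.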

πŸ”· Step 3: What if .tfstate is Deleted?

πŸ‘‰ Terraform loses tracking of infrastructure

❗ Result:

  • Terraform thinks β†’ nothing exists

  • Next terraform apply β†’ tries to recreate everything ❌

βœ… Solutions:

  • Restore from backup (.tfstate.backup)

  • Use remote backend (best practice)

πŸ”· Step 4: State Conflict (Very Important)

πŸ”Ή Scenario:

  • Developer 1 β†’ runs terraform apply

  • Developer 2 β†’ runs terraform apply

❗ What Happens?

  • Both modify same state file

  • File gets overwritten or corrupted

πŸ‘‰ This is called State Conflict

πŸ”· Step 5: Solutions

❌ Local Shared State

  • Not safe

  • Not scalable

βœ… Remote Backend (Best Practice)

Use:

  • S3 Bucket β†’ Store state file

  • DynamoDB β†’ Lock state

πŸ”· Step 6: Architecture Flow

🧠 Working:

  1. Terraform stores state in S3

  2. Before update β†’ checks DynamoDB

  3. If no lock β†’ creates LockID

  4. While locked β†’ ❌ no parallel execution

  5. After completion β†’ lock removed
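
While a run holds the lock, the DynamoDB table contains an item keyed by LockID (the bucket name plus the state key). This is an illustrative sketch, not real output; the names and values are hypothetical:

```json
{
  "LockID": "dev-tf-state-ab12/terraform.tfstate",
  "Info": "{\"Operation\":\"OperationTypeApply\",\"Who\":\"dev1@laptop\"}"
}
```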

πŸ”· Step 7: Practical Implementation

πŸ“ Step 1: Create Project Folder

mkdir remote-infra
cd remote-infra

πŸ“„ Step 2: Create Files

touch provider.tf terraform.tf s3.tf dynamodb.tf

πŸ”§ Step 3: Provider Configuration

provider "aws" {
  region = "us-east-2"
}

πŸ“¦ Step 4: Terraform Block

Pin the providers you use. The random_id resource in s3.tf comes from the hashicorp/random provider, so pin it alongside aws:

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 6.0"
    }
    random = {
      source  = "hashicorp/random"
      version = "~> 3.0"
    }
  }
}

πŸͺ£ Step 5: Create S3 Bucket

resource "random_id" "suffix" {
  byte_length = 2
}

resource "aws_s3_bucket" "remote_s3" {
  bucket = "dev-tf-state-${random_id.suffix.hex}"

  tags = {
    Name        = "tf-state-bucket"
    Environment = "dev"
  }
}
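
Two optional but widely recommended hardening additions for a state bucket: versioning lets you roll back to an earlier state file, and server-side encryption protects the secrets inside it at rest. A sketch using the separate versioning and encryption resources from recent AWS provider versions:

```hcl
resource "aws_s3_bucket_versioning" "remote_s3" {
  bucket = aws_s3_bucket.remote_s3.id

  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_s3_bucket_server_side_encryption_configuration" "remote_s3" {
  bucket = aws_s3_bucket.remote_s3.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "AES256"
    }
  }
}
```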

πŸ” Step 6: Create DynamoDB Table

resource "aws_dynamodb_table" "state_lock" {
  name         = "apurv-table"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }

  tags = {
    Name        = "apurv-table"
    Environment = "Dev"
  }
}

πŸ”‘ Step 7: IAM Permissions

Ensure your AWS user/role has:

  • S3 Full Access

  • DynamoDB Full Access

Full access is fine for a demo; in production, scope permissions down to the state bucket and the lock table.

πŸ”· Step 8: Run Terraform

terraform init
terraform validate
terraform plan
terraform apply

πŸ”· Step 9: Configure Remote Backend

Now go to your main project folder and add a backend block inside the terraform block:

terraform {
  backend "s3" {
    bucket         = "<your-bucket-name>"   # the bucket created above
    key            = "terraform.tfstate"
    region         = "us-east-2"
    dynamodb_table = "apurv-table"
    encrypt        = true
  }
}

Note: backend blocks cannot reference variables, so the values must be hard-coded here.

πŸ”„ Reinitialize

terraform init -migrate-state

πŸ‘‰ Terraform detects the backend change and offers to copy your existing local state into S3.

πŸ”· Step 10: Remove Local State

Once you have confirmed the state was migrated to S3, the local copies are safe to delete:

rm terraform.tfstate*

βœ… Verify Remote State

terraform state list

πŸ‘‰ Resources will still appear
βœ” Because state is now stored in S3

πŸ”· Step 11: Test State Locking

Terminal 1:

terraform apply

Terminal 2:

terraform apply

❗ Result:

  • Terminal 2 β†’ ❌ fails with "Error acquiring the state lock" (or waits, if you pass -lock-timeout)

  • Reason β†’ a lock record already exists in DynamoDB

βœ” After Completion:

  • Lock is removed

  • Second execution proceeds

  • If a run crashes and leaves a stale lock, terraform force-unlock <LOCK_ID> removes it (use with care)

  • After testing everything, you can clean up with terraform destroy

πŸš€ Conclusion

  • Terraform state is critical for infrastructure tracking

  • Never store state locally in production

  • Use S3 for storage + DynamoDB for locking

  • Prevents:

    • Data loss

    • State conflicts

    • Security risks

πŸ‘¨β€πŸ’» About the Author

β€œA complete Terraform series covering everything from fundamentals to advanced real-world infrastructure automation in a DevOps environment.”
