· PathShield Team · Tutorials · 11 min read
Terraform Security Scanning in CI/CD - Catch Issues Before Deploy
Stop deploying vulnerable infrastructure. Learn how to integrate Terraform security scanning into your CI/CD pipeline with practical examples and scripts.
Your Terraform code creates an S3 bucket with public read access. Your security team finds it three weeks later during an audit. By then, you've got compliance violations, audit findings, and a very unhappy CISO. This guide shows you how to catch these issues in your CI/CD pipeline, not in production.
Why Terraform Security Scanning Matters
The problem with post-deployment scanning:
- Security issues are already in production
- Fixing requires another deployment cycle
- Creates technical debt and risk exposure
- Teams resist fixing "working" code
The power of pre-deployment scanning:
- Catch issues before they reach production
- Fix is just updating the code
- No downtime or rollback needed
- Teams build security habits naturally
The Complete Terraform Security Pipeline
Here's a production-ready CI/CD pipeline that scans Terraform code before deployment:
# .github/workflows/terraform-security.yml
name: Terraform Security Pipeline
on:
  pull_request:
    paths:
      - 'terraform/**'
      - '*.tf'
  push:
    branches: [main]
env:
  TF_VERSION: '1.6.0'
jobs:
  security-scan:
    name: Security Scan
    runs-on: ubuntu-latest
    
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_version: ${{ env.TF_VERSION }}
          
      - name: Terraform Init
        run: terraform init -backend=false
        
      - name: Terraform Validate
        run: terraform validate
        
      - name: Run Checkov
        uses: bridgecrewio/checkov-action@master
        with:
          directory: .
          framework: terraform
          output_format: sarif
          output_file_path: checkov-results.sarif
          
      - name: Run TFSec
        uses: aquasecurity/tfsec-action@v1.0.0
        with:
          soft_fail: false
          
      - name: Run Terrascan
        run: |
          curl -L "$(curl -s https://api.github.com/repos/tenable/terrascan/releases/latest | grep -o -E "https://.+?_Linux_x86_64.tar.gz")" > terrascan.tar.gz
          tar -xf terrascan.tar.gz terrascan && rm terrascan.tar.gz
          ./terrascan scan -t terraform -d .
          
      - name: Run Custom Security Checks
        run: |
          set -o pipefail
          python scripts/custom_tf_security_scan.py | tee scan-results.txt
          
      - name: Upload SARIF Results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: checkov-results.sarif
          
      - name: Comment PR with Results
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v6
        with:
          script: |
            const fs = require('fs');
            const results = fs.readFileSync('scan-results.txt', 'utf8');
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: `## 🔒 Terraform Security Scan Results\n\n\`\`\`\n${results}\n\`\`\``
            });
  plan-and-deploy:
    name: Plan and Deploy
    runs-on: ubuntu-latest
    needs: security-scan
    if: github.ref == 'refs/heads/main'
    
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          aws-region: us-east-1
          
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_version: ${{ env.TF_VERSION }}
          
      - name: Terraform Plan
        run: |
          terraform init
          terraform plan -out=tfplan
          
      - name: Scan Terraform Plan
        run: |
          terraform show -json tfplan > tfplan.json
          python scripts/scan_terraform_plan.py tfplan.json
          
      - name: Terraform Apply
        if: success()
        run: terraform apply tfplan
Security Scanning Tools Comparison
1. Checkov (Comprehensive)
# Install Checkov
pip install checkov
# Scan Terraform files
checkov -f main.tf --framework terraform
# Example output:
# Check: CKV_AWS_18: "Ensure the S3 bucket has access logging configured"
# FAILED for resource: aws_s3_bucket.example
# File: /main.tf:1-5
Best for: Comprehensive policy coverage, custom policies, multiple IaC formats
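If you want to gate a pipeline on Checkov's findings rather than just its exit code, its JSON report can be parsed directly. A minimal sketch, assuming a report shaped like the output of `checkov -d . -o json` (the sample dict below is illustrative, not real scan output):

```python
def checkov_failed_checks(report: dict) -> list:
    """Return the failed checks from a Checkov-style JSON report."""
    return report.get("results", {}).get("failed_checks", [])

# Illustrative report, shaped like `checkov -d . -o json` output
sample_report = {
    "results": {
        "passed_checks": [{"check_id": "CKV_AWS_20"}],
        "failed_checks": [
            {"check_id": "CKV_AWS_18", "resource": "aws_s3_bucket.example"},
        ],
    }
}

failed = checkov_failed_checks(sample_report)
for check in failed:
    print(f"FAILED {check['check_id']}: {check['resource']}")
```

A wrapper like this lets you apply your own pass/fail policy (for example, ignoring low-severity checks) instead of failing on any finding.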
2. TFSec (Fast & Focused)
# Install TFSec
brew install tfsec
# Scan current directory
tfsec .
# Example output:
# Result #1 HIGH S3 bucket is publicly readable
# ──────────────────────────────────────────────────────────────────────────────
#   main.tf:10-14
# 
#   resource "aws_s3_bucket_public_access_block" "example" {
#     bucket = aws_s3_bucket.example.id
#   }
Best for: Speed, Terraform-specific issues, CI/CD integration
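TFSec's JSON output (a top-level `results` array with a `severity` on each finding) makes it easy to enforce a severity threshold instead of failing on everything. A small sketch (the findings below are illustrative samples, not real tfsec output):

```python
SEVERITY_RANK = {"LOW": 0, "MEDIUM": 1, "HIGH": 2, "CRITICAL": 3}

def should_fail(results: list, threshold: str = "HIGH") -> bool:
    """Fail the build if any finding meets or exceeds the threshold."""
    limit = SEVERITY_RANK[threshold]
    return any(
        SEVERITY_RANK.get(r.get("severity", "LOW"), 0) >= limit
        for r in results
    )

# Illustrative findings, shaped like tfsec's JSON `results` array
findings = [
    {"rule_id": "aws-s3-enable-bucket-logging", "severity": "MEDIUM"},
    {"rule_id": "aws-vpc-no-public-ingress-sgr", "severity": "HIGH"},
]

print(should_fail(findings))              # gate at HIGH and above
print(should_fail(findings, "CRITICAL"))  # only block on CRITICAL
```

Teams often start with a `CRITICAL`-only gate and tighten the threshold as the backlog shrinks.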
3. Terrascan (Policy Engine)
# Install Terrascan
curl -L "$(curl -s https://api.github.com/repos/tenable/terrascan/releases/latest | grep -o -E "https://.+?_Linux_x86_64.tar.gz")" > terrascan.tar.gz
tar -xf terrascan.tar.gz terrascan
# Scan with custom policies
./terrascan scan -p policies/ -t terraform
Best for: Custom policy engines, compliance frameworks, detailed reporting
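Since each of these tools reports findings in its own schema, a thin normalization layer helps when you aggregate results in one pipeline. A sketch with illustrative field mappings (the per-tool field names here are assumptions, not exact schemas):

```python
def normalize(tool: str, raw: dict) -> dict:
    """Map a tool-specific finding onto one shared shape.
    Per-tool field names are illustrative, not exact schemas."""
    if tool == "checkov":
        return {"tool": tool, "rule": raw["check_id"],
                "severity": raw.get("severity") or "UNKNOWN"}
    if tool == "tfsec":
        return {"tool": tool, "rule": raw["rule_id"],
                "severity": raw["severity"]}
    if tool == "terrascan":
        return {"tool": tool, "rule": raw["rule_name"],
                "severity": raw["severity"].upper()}
    raise ValueError(f"unknown tool: {tool}")

findings = [
    normalize("checkov", {"check_id": "CKV_AWS_18", "severity": "HIGH"}),
    normalize("tfsec", {"rule_id": "aws-s3-enable-bucket-logging",
                        "severity": "MEDIUM"}),
    normalize("terrascan", {"rule_name": "s3EnforceUserACL",
                            "severity": "high"}),
]

high_or_worse = [f for f in findings if f["severity"] in ("HIGH", "CRITICAL")]
print(f"{len(high_or_worse)} high-severity findings of {len(findings)} total")
```

With a shared shape you can deduplicate overlapping findings and apply one severity gate across all three scanners.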
Custom Security Scanning Script
#!/usr/bin/env python3
"""
Custom Terraform security scanner
Checks for startup-specific security issues
"""
import os
import re
import json
import sys
from pathlib import Path
class TerraformSecurityScanner:
    def __init__(self):
        self.findings = []
        self.critical_count = 0
        self.high_count = 0
        
    def scan_directory(self, directory='.'):
        """Scan all .tf files in directory"""
        tf_files = list(Path(directory).rglob('*.tf'))
        
        print(f"Scanning {len(tf_files)} Terraform files...")
        
        for tf_file in tf_files:
            self.scan_file(tf_file)
            
    def scan_file(self, file_path):
        """Scan individual Terraform file"""
        try:
            with open(file_path, 'r') as f:
                content = f.read()
                
            # Check for common security issues
            self.check_hardcoded_secrets(file_path, content)
            self.check_public_resources(file_path, content)
            self.check_encryption(file_path, content)
            self.check_network_security(file_path, content)
            self.check_iam_policies(file_path, content)
            
        except Exception as e:
            print(f"Error scanning {file_path}: {e}")
    
    def check_hardcoded_secrets(self, file_path, content):
        """Check for hardcoded secrets"""
        secret_patterns = [
            (r'password\s*=\s*"[^"]*"', 'Hardcoded password'),
            (r'secret_key\s*=\s*"[^"]*"', 'Hardcoded secret key'),
            (r'access_key\s*=\s*"AKIA[0-9A-Z]{16}"', 'AWS access key'),
            (r'private_key\s*=\s*"-----BEGIN', 'Private key in code'),
            (r'token\s*=\s*"[a-zA-Z0-9]{20,}"', 'Hardcoded token')
        ]
        
        for pattern, description in secret_patterns:
            if re.search(pattern, content, re.IGNORECASE):
                self.add_finding(
                    file_path, 'CRITICAL', 'Hardcoded Secret',
                    f"{description} found in {file_path}"
                )
    
    def check_public_resources(self, file_path, content):
        """Check for publicly accessible resources"""
        
        # S3 bucket public access
        if 'aws_s3_bucket_public_access_block' not in content:
            if 'aws_s3_bucket' in content:
                self.add_finding(
                    file_path, 'HIGH', 'S3 Security',
                    'S3 bucket without public access block'
                )
        
        # RDS public access
        if re.search(r'publicly_accessible\s*=\s*true', content):
            self.add_finding(
                file_path, 'CRITICAL', 'Database Security',
                'RDS instance configured as publicly accessible'
            )
        
        # Security groups with 0.0.0.0/0
        if re.search(r'cidr_blocks\s*=\s*\["0\.0\.0\.0/0"\]', content):
            self.add_finding(
                file_path, 'HIGH', 'Network Security',
                'Security group allows access from anywhere (0.0.0.0/0)'
            )
    
    def check_encryption(self, file_path, content):
        """Check for encryption configurations"""
        
        # S3 bucket encryption
        if 'aws_s3_bucket' in content and 'aws_s3_bucket_server_side_encryption_configuration' not in content:
            self.add_finding(
                file_path, 'MEDIUM', 'Encryption',
                'S3 bucket without server-side encryption'
            )
        
        # RDS encryption
        if 'aws_db_instance' in content and 'storage_encrypted = true' not in content:
            self.add_finding(
                file_path, 'MEDIUM', 'Encryption',
                'RDS instance without encryption at rest'
            )
        
        # EBS encryption
        if 'aws_instance' in content and 'encrypted = true' not in content:
            self.add_finding(
                file_path, 'MEDIUM', 'Encryption',
                'EC2 instance without encrypted EBS volumes'
            )
    
    def check_network_security(self, file_path, content):
        """Check network security configurations"""
        
        # VPC configuration
        if 'aws_instance' in content and 'subnet_id' not in content:
            self.add_finding(
                file_path, 'MEDIUM', 'Network Security',
                'EC2 instance without explicit subnet configuration'
            )
        
        # Default security group usage
        if re.search(r'security_groups\s*=\s*\["default"\]', content):
            self.add_finding(
                file_path, 'LOW', 'Network Security',
                'Using default security group'
            )
    
    def check_iam_policies(self, file_path, content):
        """Check IAM policy configurations"""
        
        # Overly permissive policies
        if re.search(r'"Action"\s*:\s*"\*"', content):
            self.add_finding(
                file_path, 'HIGH', 'IAM Security',
                'IAM policy with wildcard actions (*)'
            )
        
        if re.search(r'"Resource"\s*:\s*"\*"', content):
            self.add_finding(
                file_path, 'MEDIUM', 'IAM Security',
                'IAM policy with wildcard resources (*)'
            )
        
        # Root account usage
        if re.search(r'"Principal"\s*:\s*"\*"', content):
            self.add_finding(
                file_path, 'CRITICAL', 'IAM Security',
                'Policy allows access from any principal (*)'
            )
    
    def add_finding(self, file_path, severity, category, description):
        """Add security finding"""
        self.findings.append({
            'file': str(file_path),
            'severity': severity,
            'category': category,
            'description': description
        })
        
        if severity == 'CRITICAL':
            self.critical_count += 1
        elif severity == 'HIGH':
            self.high_count += 1
    
    def generate_report(self):
        """Generate security report"""
        print(f"\n{'='*80}")
        print("TERRAFORM SECURITY SCAN RESULTS")
        print(f"{'='*80}")
        
        if not self.findings:
            print("✅ No security issues found!")
            return True
        
        print(f"Total findings: {len(self.findings)}")
        print(f"Critical: {self.critical_count}")
        print(f"High: {self.high_count}")
        
        # Group findings by severity
        by_severity = {}
        for finding in self.findings:
            severity = finding['severity']
            if severity not in by_severity:
                by_severity[severity] = []
            by_severity[severity].append(finding)
        
        # Show critical and high severity issues
        for severity in ['CRITICAL', 'HIGH']:
            if severity in by_severity:
                print(f"\n{severity} SEVERITY:")
                for finding in by_severity[severity]:
                    print(f"  📁 {finding['file']}")
                    print(f"     {finding['category']}: {finding['description']}")
        
        # Fail if critical or high severity issues found
        return self.critical_count == 0 and self.high_count == 0
    
    def generate_sarif(self, output_file='terraform-security.sarif'):
        """Generate SARIF output for GitHub integration"""
        sarif_output = {
            "version": "2.1.0",
            "runs": [{
                "tool": {
                    "driver": {
                        "name": "Custom Terraform Scanner",
                        "version": "1.0.0"
                    }
                },
                "results": []
            }]
        }
        
        for finding in self.findings:
            result = {
                "ruleId": finding['category'].replace(' ', '_').lower(),
                # SARIF only accepts error/warning/note levels
                "level": "error" if finding['severity'] in ('CRITICAL', 'HIGH') else "warning",
                "message": {
                    "text": finding['description']
                },
                "locations": [{
                    "physicalLocation": {
                        "artifactLocation": {
                            "uri": finding['file']
                        }
                    }
                }]
            }
            sarif_output["runs"][0]["results"].append(result)
        
        with open(output_file, 'w') as f:
            json.dump(sarif_output, f, indent=2)
        
        print(f"SARIF report generated: {output_file}")
def main():
    scanner = TerraformSecurityScanner()
    scanner.scan_directory()
    
    success = scanner.generate_report()
    scanner.generate_sarif()
    
    # Exit with error code if issues found
    sys.exit(0 if success else 1)
if __name__ == '__main__':
    main()
Terraform Plan Security Scanning
Scan the actual deployment plan, not just the code:
#!/usr/bin/env python3
"""
Terraform Plan Security Scanner
Analyzes terraform plan JSON for security issues
"""
import json
import sys
def scan_terraform_plan(plan_file):
    """Scan terraform plan for security issues"""
    
    with open(plan_file, 'r') as f:
        plan = json.load(f)
    
    findings = []
    
    for change in plan.get('resource_changes', []):
        resource_type = change.get('type')
        resource_name = change.get('name')
        change_actions = change.get('change', {}).get('actions', [])
        
        if 'create' not in change_actions and 'update' not in change_actions:
            continue
            
        after_values = change.get('change', {}).get('after', {})
        
        # Check S3 buckets
        if resource_type == 'aws_s3_bucket':
            # Check for versioning ('versioning' appears as a list of blocks in plan JSON)
            versioning = (after_values.get('versioning') or [{}])[0]
            if not versioning.get('enabled'):
                findings.append({
                    'resource': f"{resource_type}.{resource_name}",
                    'severity': 'MEDIUM',
                    'issue': 'S3 bucket without versioning'
                })
        
        # Check RDS instances
        elif resource_type == 'aws_db_instance':
            if after_values.get('publicly_accessible'):
                findings.append({
                    'resource': f"{resource_type}.{resource_name}",
                    'severity': 'CRITICAL',
                    'issue': 'RDS instance publicly accessible'
                })
            
            if not after_values.get('storage_encrypted'):
                findings.append({
                    'resource': f"{resource_type}.{resource_name}",
                    'severity': 'HIGH',
                    'issue': 'RDS instance without encryption'
                })
        
        # Check security groups
        elif resource_type == 'aws_security_group':
            ingress_rules = after_values.get('ingress', [])
            for rule in ingress_rules:
                cidr_blocks = rule.get('cidr_blocks', [])
                if '0.0.0.0/0' in cidr_blocks:
                    from_port = rule.get('from_port', 0)
                    to_port = rule.get('to_port', 65535)
                    
                    if from_port in [22, 3389, 3306, 5432]:  # Risky ports
                        findings.append({
                            'resource': f"{resource_type}.{resource_name}",
                            'severity': 'CRITICAL',
                            'issue': f'Security group allows {from_port}-{to_port} from anywhere'
                        })
    
    # Report findings
    if findings:
        print(f"🚨 Found {len(findings)} security issues in Terraform plan:")
        
        critical = [f for f in findings if f['severity'] == 'CRITICAL']
        if critical:
            print(f"\nCRITICAL Issues ({len(critical)}):")
            for finding in critical:
                print(f"  ❌ {finding['resource']}: {finding['issue']}")
            
            print("\n❌ Deployment blocked due to critical security issues!")
            sys.exit(1)
        
        high = [f for f in findings if f['severity'] == 'HIGH']
        if high:
            print(f"\nHIGH Issues ({len(high)}):")
            for finding in high:
                print(f"  ⚠️  {finding['resource']}: {finding['issue']}")
    
    else:
        print("✅ No security issues found in Terraform plan")
if __name__ == '__main__':
    if len(sys.argv) != 2:
        print("Usage: python scan_terraform_plan.py <plan.json>")
        sys.exit(1)
    
    scan_terraform_plan(sys.argv[1])
Policy as Code Examples
Custom Checkov Policy
# policies/S3BucketMustHaveNotificationConfiguration.py
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
from checkov.common.models.enums import CheckResult
class S3BucketNotificationConfiguration(BaseResourceCheck):
    def __init__(self):
        name = "S3 bucket must have notification configuration"
        id = "CKV_AWS_CUSTOM_1"
        supported_resources = ['aws_s3_bucket']
        categories = []
        super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
    def scan_resource_conf(self, conf):
        """
        Looks for notification configuration on S3 buckets
        """
        if 'notification' in conf:
            return CheckResult.PASSED
        return CheckResult.FAILED
check = S3BucketNotificationConfiguration()
OPA Rego Policy for Terrascan
# policies/s3_encryption.rego
package accurics
s3_bucket_encryption_missing[retVal] {
    resource_type := "aws_s3_bucket"
    resource := input.resource_changes[i]
    resource.type == resource_type
    
    # Check if server-side encryption is configured
    not resource.change.after.server_side_encryption_configuration
    
    retVal := {
        "Id": "S3.1",
        "ReplaceType": "add",
        "CodeType": "resource",
        "Traverse": sprintf("resource_changes[%d].change.after.server_side_encryption_configuration", [i]),
        "Attribute": "server_side_encryption_configuration",
        "AttributeDataType": "list",
        "Expected": [{"rule": [{"apply_server_side_encryption_by_default": [{"sse_algorithm": "AES256"}]}]}],
        "Actual": null
    }
}
Pre-commit Hooks for Terraform Security
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-json
      
  - repo: https://github.com/antonbabenko/pre-commit-terraform
    rev: v1.81.0
    hooks:
      - id: terraform_fmt
      - id: terraform_validate
      - id: terraform_tflint
      - id: terraform_checkov
        args: [--args=--framework terraform]
      - id: terraform_tfsec
        
  - repo: local
    hooks:
      - id: custom-terraform-security
        name: Custom Terraform Security Scan
        entry: python scripts/custom_tf_security_scan.py
        language: python
        files: \.tf$
        pass_filenames: false
Integration with Popular CI/CD Platforms
GitLab CI
# .gitlab-ci.yml
stages:
  - validate
  - security-scan
  - deploy
terraform-security:
  stage: security-scan
  image: bridgecrew/checkov:latest
  script:
    - checkov --directory . --framework terraform --output cli --output junitxml --output-file-path console,checkov-report.xml
  artifacts:
    reports:
      junit: checkov-report.xml
    paths:
      - checkov-report.xml
  allow_failure: false
Jenkins Pipeline
pipeline {
    agent any
    
    stages {
        stage('Terraform Security Scan') {
            parallel {
                stage('Checkov') {
                    steps {
                        sh 'checkov -d . --framework terraform --output cli'
                    }
                }
                stage('TFSec') {
                    steps {
                        sh 'tfsec . --format json --out tfsec-results.json'
                        archiveArtifacts artifacts: 'tfsec-results.json'
                    }
                }
                stage('Custom Scan') {
                    steps {
                        sh 'python scripts/custom_tf_security_scan.py'
                    }
                }
            }
        }
        
        stage('Deploy') {
            when {
                branch 'main'
            }
            steps {
                sh 'terraform init'
                sh 'terraform plan -out=tfplan'
                sh 'terraform show -json tfplan > tfplan.json'
                sh 'python scripts/scan_terraform_plan.py tfplan.json'
                sh 'terraform apply tfplan'
            }
        }
    }
}
Handling False Positives
Terraform Comments for Exceptions
# Checkov skip comment
resource "aws_s3_bucket" "example" {
  #checkov:skip=CKV_AWS_18:This bucket is for public static content
  bucket = "my-public-static-content"
}
# TFSec skip comment  
resource "aws_security_group" "web" {
  #tfsec:ignore:aws-vpc-no-public-ingress-sgr
  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]  # Intentionally public for web traffic
  }
}
Configuration Files for Exceptions
# .checkov.yml
framework:
  - terraform
skip-check:
  - CKV_AWS_18  # S3 bucket logging not required for static content
  - CKV_AWS_52  # S3 bucket MFA delete not required in dev
  
# .tfsec.yml
exclude:
  - aws-s3-enable-bucket-logging
  - aws-vpc-no-public-ingress-sgr
Monitoring and Alerting
# terraform_security_monitor.py
import os
import requests
import json
def send_slack_alert(findings):
    """Send security findings to Slack"""
    
    critical_count = len([f for f in findings if f['severity'] == 'CRITICAL'])
    
    if critical_count > 0:
        webhook_url = os.environ['SLACK_WEBHOOK_URL']
        
        message = {
            "text": "🚨 Critical Terraform security issues found!",
            "attachments": [{
                "color": "danger",
                "fields": [
                    {
                        "title": "Critical Issues",
                        "value": str(critical_count),
                        "short": True
                    },
                    {
                        "title": "Repository", 
                        "value": os.environ.get('GITHUB_REPOSITORY', 'Unknown'),
                        "short": True
                    }
                ]
            }]
        }
        
        requests.post(webhook_url, json=message)
def create_jira_ticket(findings):
    """Create JIRA ticket for security findings"""
    
    critical_findings = [f for f in findings if f['severity'] == 'CRITICAL']
    
    if critical_findings:
        # JIRA API integration code here
        pass
Conclusion
Terraform security scanning in CI/CD isn't optional; it's essential. The cost of fixing security issues in code is minimal compared to fixing them in production. Start with the tools and scripts in this guide, and gradually build a comprehensive security pipeline.
Your implementation roadmap:
- Week 1: Add basic TFSec or Checkov to your CI/CD
- Week 2: Implement the custom security scanner
- Week 3: Add Terraform plan scanning
- Week 4: Set up alerts and dashboards
Remember: the goal isn't perfect security; it's catching obvious issues before they become production problems. These tools and practices will prevent the vast majority of avoidable infrastructure security incidents.
Want automated Terraform security scanning without managing multiple tools? Modern platforms like PathShield can integrate with your CI/CD pipeline and provide comprehensive infrastructure security scanning with a single integration.