
Google Cloud Storage is a powerful, scalable object storage service perfect for storing files, images, videos, and any unstructured data. It can also host static websites and serve content globally using Cloud CDN.
By the end of this lesson, you'll understand how to create and manage buckets, upload and download objects, control access with ACLs and IAM, host a static website, and serve content through Cloud CDN.
Cloud Storage is a fully managed, scalable object storage service: files are stored as objects inside buckets with globally unique names, and Google handles durability, replication, and capacity behind the scenes.
Cloud Storage offers different storage classes for different use cases:
| Class | Access Pattern | Cost | Best For |
|---|---|---|---|
| Standard | Frequent | Normal | Websites, mobile apps, analytics |
| Nearline | Infrequent (< once/month) | Lower | Backups, archival |
| Coldline | Rare (< once/quarter) | Very Low | Long-term archival |
| Archive | Minimal (< once/year) | Minimal | Compliance, long-term backup |
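To make the table concrete, here is a small sketch that picks a class from an expected access interval in days. The thresholds mirror the table above; the function name `pick_storage_class` is our own, not part of any Google library:

```python
def pick_storage_class(days_between_accesses: int) -> str:
    """Map an expected access interval to a Cloud Storage class,
    following the 30/90/365-day boundaries in the table above."""
    if days_between_accesses < 30:
        return "STANDARD"   # frequent access
    if days_between_accesses < 90:
        return "NEARLINE"   # roughly monthly
    if days_between_accesses < 365:
        return "COLDLINE"   # roughly quarterly
    return "ARCHIVE"        # yearly or less

print(pick_storage_class(7))    # STANDARD
print(pick_storage_class(400))  # ARCHIVE
```

Note that Nearline, Coldline, and Archive also carry minimum storage durations (30, 90, and 365 days respectively), so deleting an object early still incurs the minimum charge.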
# Create a bucket (bucket names must be globally unique)
gsutil mb -p my-project -l us-central1 gs://my-app-bucket
# Create with specific storage class
gsutil mb -p my-project -l us-central1 -c STANDARD gs://my-app-bucket
# List all buckets
gsutil ls
# View bucket details
gsutil ls -L gs://my-app-bucket

# Set a default project so later commands can omit -p
gcloud config set project my-project

# Now you can use shorter commands
gsutil ls  # Lists buckets in the default project

# Upload a single file
gsutil cp index.html gs://my-app-bucket/
# Upload directory recursively
gsutil -m cp -r ./public/* gs://my-app-bucket/
# Upload with gzip compression (stored compressed, served with Content-Encoding: gzip)
gsutil cp -Z large-file.json gs://my-app-bucket/
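Gzip only pays off for compressible content. A quick pure-Python check of how much a text asset shrinks (no gsutil involved):

```python
import gzip

# A repetitive HTML payload, typical of markup that compresses well
html = b"<html><body>" + b"<p>hello world</p>" * 500 + b"</body></html>"
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.0%})")
# Text formats (HTML/CSS/JS) shrink dramatically; already-compressed
# binaries like JPEG or tar.gz do not, so skip re-compressing those.
```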
# Upload with metadata
gsutil -h "Cache-Control:public, max-age=3600" cp image.jpg gs://my-app-bucket/

# Download a file
gsutil cp gs://my-app-bucket/index.html .
# Download entire bucket
gsutil -m cp -r gs://my-app-bucket/* ./local-copy/
# Download specific file type
gsutil -m cp gs://my-app-bucket/*.jpg ./images/

# List files in bucket
gsutil ls gs://my-app-bucket/
# List files recursively
gsutil ls -r gs://my-app-bucket/
# Get file information
gsutil stat gs://my-app-bucket/index.html
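Every command here addresses objects by `gs://` URL. If you wrap gsutil in scripts, a small parser keeps bucket and object names straight (our own helper, not part of any Google library):

```python
def parse_gs_url(url: str) -> tuple[str, str]:
    """Split gs://bucket/path/to/object into (bucket, object_name)."""
    if not url.startswith("gs://"):
        raise ValueError(f"not a gs:// URL: {url}")
    bucket, _, obj = url[len("gs://"):].partition("/")
    return bucket, obj

print(parse_gs_url("gs://my-app-bucket/index.html"))
# ('my-app-bucket', 'index.html')
```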
# Delete a file
gsutil rm gs://my-app-bucket/old-file.html
# Delete a bucket and everything in it
gsutil rm -r gs://my-app-bucket/

# Delete an empty bucket
gsutil rb gs://my-app-bucket
# Copy between buckets
gsutil cp gs://source-bucket/file.txt gs://dest-bucket/
# Move/rename files
gsutil mv gs://my-bucket/old-name.txt gs://my-bucket/new-name.txt

# Make file publicly readable
gsutil acl ch -u AllUsers:R gs://my-app-bucket/public.html
# Remove public access
gsutil acl ch -d AllUsers gs://my-app-bucket/public.html
# Make entire bucket public (not recommended for sensitive data)
gsutil iam ch allUsers:objectViewer gs://my-app-bucket
# Remove all public access from bucket
gsutil iam ch -d allUsers:objectViewer gs://my-app-bucket

Generate temporary download links without making files public:
# Create a signed URL valid for 1 hour
gsutil signurl -d 1h /path/to/service-account-key.json \
gs://my-app-bucket/private-file.pdf
# Signed URL valid for 7 days
gsutil signurl -d 7d /path/to/service-account-key.json \
gs://my-app-bucket/private-file.pdf

# Grant user read-only access
gsutil iam ch user:colleague@example.com:objectViewer gs://my-app-bucket
# Grant service account write access
gsutil iam ch serviceAccount:sa@my-project.iam.gserviceaccount.com:objectEditor gs://my-app-bucket
# View all permissions on bucket
gsutil iam get gs://my-app-bucket

# Create bucket for website
gsutil mb -p my-project -l us-central1 gs://my-website.com
# Make bucket public (so files can be viewed)
gsutil iam ch allUsers:objectViewer gs://my-website.com
# Configure as website (set index.html and 404 page)
gsutil web set -m index.html -e 404.html gs://my-website.com
# Verify configuration
gsutil web get gs://my-website.com

# Structure your files
# public/
# ├── index.html
# ├── 404.html
# ├── styles.css
# └── images/
# └── logo.png
# Upload all files
gsutil -m cp -r public/* gs://my-website.com/
# Set appropriate cache headers
gsutil -h "Cache-Control:public, max-age=3600" -m cp -r public/*.html gs://my-website.com/
gsutil -h "Cache-Control:public, max-age=31536000" -m cp -r public/images/* gs://my-website.com/images/

public/index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>My Website</title>
<link rel="stylesheet" href="styles.css">
</head>
<body>
<h1>Welcome to My Website</h1>
<p>Hosted on Google Cloud Storage</p>
<img src="images/logo.png" alt="Logo">
</body>
</html>

public/404.html:
<!DOCTYPE html>
<html lang="en">
<head>
<title>404 - Page Not Found</title>
</head>
<body>
<h1>404 - Page Not Found</h1>
<p><a href="/">Go back home</a></p>
</body>
</html>

# Add custom domain to bucket (requires DNS setup)
# 1. Verify domain ownership in Google Cloud Console
# 2. Create DNS CNAME record pointing to c.storage.googleapis.com
# 3. For HTTPS, configure a load balancer to route the domain to the bucket
# For Firebase Hosting (easier custom domain setup):
firebase hosting:sites:list
# (custom domains are then attached in the Firebase console, not via the CLI)

# Create a backend bucket for Cloud Storage
gcloud compute backend-buckets create my-cdn-backend \
--gcs-bucket-name=my-app-bucket \
--enable-cdn
# Create URL map for routing
gcloud compute url-maps create my-cdn-urlmap \
--default-backend-bucket=my-cdn-backend
# Create HTTPS proxy
gcloud compute target-https-proxies create my-cdn-proxy \
--url-map=my-cdn-urlmap \
--ssl-certificates=my-ssl-cert
# Create forwarding rule
gcloud compute forwarding-rules create my-cdn-rule \
--global \
--target-https-proxy=my-cdn-proxy \
--address=my-static-ip \
--ports=443

# Set cache policy on backend bucket
gcloud compute backend-buckets update my-cdn-backend \
--cache-mode=CACHE_ALL_STATIC \
--default-ttl=3600 \
--max-ttl=86400
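The two TTL layers here are easy to mix up: the backend-bucket settings control the CDN edge cache, while Cache-Control headers on the objects control browsers. A helper like this keeps per-file-type header policy in one place (the names `CACHE_POLICY` and `cache_control_for` are our own; the max-age values mirror the ones used in this lesson):

```python
import os

# Illustrative policy: short TTL for HTML (changes often),
# long TTL for images and other rarely-changing assets.
CACHE_POLICY = {
    ".html": "public, max-age=3600",       # 1 hour
    ".css":  "public, max-age=86400",      # 1 day
    ".js":   "public, max-age=86400",
    ".png":  "public, max-age=31536000",   # 1 year
    ".jpg":  "public, max-age=31536000",
}

def cache_control_for(path: str) -> str:
    """Return the Cache-Control header value for a file, by extension."""
    ext = os.path.splitext(path)[1].lower()
    return CACHE_POLICY.get(ext, "public, max-age=3600")

print(cache_control_for("images/logo.png"))  # public, max-age=31536000
```

The returned string is what you would pass via `gsutil -h "Cache-Control:..."` on upload, or set as object metadata from a client library.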
# Set client-facing cache headers in your files
gsutil -h "Cache-Control:public, max-age=3600" cp index.html gs://my-app-bucket/

// server.js - Upload file to Cloud Storage
const {Storage} = require('@google-cloud/storage');
const express = require('express');
const multer = require('multer');
const app = express();
const storage = new Storage({projectId: 'my-project'});
const bucket = storage.bucket('my-app-bucket');
// Configure multer to keep uploads in memory
const upload = multer({ storage: multer.memoryStorage() });

// Upload endpoint
app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    const file = bucket.file(req.file.originalname);
    await file.save(req.file.buffer, {
      metadata: {
        contentType: req.file.mimetype
      }
    });
    res.json({
      success: true,
      message: 'File uploaded',
      url: `https://storage.googleapis.com/my-app-bucket/${req.file.originalname}`
    });
  } catch (error) {
    console.error('Upload error:', error);
    res.status(500).json({ error: 'Upload failed' });
  }
});

// Download endpoint
app.get('/download/:filename', async (req, res) => {
  try {
    const file = bucket.file(req.params.filename);
    const stream = file.createReadStream();
    stream.on('error', () => {
      res.status(404).json({ error: 'File not found' });
    });
    res.setHeader('Content-Disposition', `attachment; filename="${req.params.filename}"`);
    stream.pipe(res);
  } catch (error) {
    res.status(500).json({ error: 'Download failed' });
  }
});

app.listen(3000, () => console.log('Server running on port 3000'));

# app.py - Upload file to Cloud Storage
from flask import Flask, request, jsonify
from google.cloud import storage
import os
app = Flask(__name__)
storage_client = storage.Client(project='my-project')
bucket = storage_client.bucket('my-app-bucket')
@app.route('/upload', methods=['POST'])
def upload_file():
    try:
        file = request.files['file']
        blob = bucket.blob(file.filename)
        blob.upload_from_string(
            file.read(),
            content_type=file.content_type
        )
        return jsonify({
            'success': True,
            'url': f'https://storage.googleapis.com/my-app-bucket/{file.filename}'
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

@app.route('/download/<filename>', methods=['GET'])
def download_file(filename):
    try:
        blob = bucket.blob(filename)
        file_content = blob.download_as_bytes()
        return file_content, 200, {
            'Content-Disposition': f'attachment; filename="{filename}"',
            'Content-Type': blob.content_type
        }
    except Exception as e:
        return jsonify({'error': 'File not found'}), 404

if __name__ == '__main__':
    app.run(debug=False, port=3000)

# Hot data → Standard (frequent access)
gsutil -m cp -r ./website/* gs://hot-bucket/
# Warm data → Nearline (monthly access)
gsutil -m cp -r ./backups/* gs://backup-bucket/
# Cold data → Archive (yearly access)
gsutil -m cp -r ./archives/* gs://archive-bucket/
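Lifecycle configurations like the one below are plain JSON, so they can be generated programmatically when you manage many buckets. A sketch (the helper name `build_lifecycle` is ours):

```python
import json

def build_lifecycle(transitions, delete_after_days=None):
    """Build a Cloud Storage lifecycle config dict.

    transitions: list of (age_in_days, storage_class) pairs.
    delete_after_days: optionally delete objects older than this.
    """
    rules = [
        {"action": {"type": "SetStorageClass", "storageClass": cls},
         "condition": {"age": age}}
        for age, cls in transitions
    ]
    if delete_after_days is not None:
        rules.append({"action": {"type": "Delete"},
                      "condition": {"age": delete_after_days}})
    return {"lifecycle": {"rule": rules}}

# Move to Nearline after 90 days, Archive after a year, delete after ~7 years
config = build_lifecycle([(90, "NEARLINE"), (365, "ARCHIVE")],
                         delete_after_days=2555)
print(json.dumps(config, indent=2))
```

Write the result to a file and apply it with `gsutil lifecycle set`.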
# Transition to cheaper class after 90 days
# Write the policy to a file, then apply it
cat > lifecycle.json << 'EOF'
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
        "condition": {"age": 90}
      }
    ]
  }
}
EOF
gsutil lifecycle set lifecycle.json gs://my-bucket

# Use consistent naming
# Format: gs://bucket-name/project/resource-type/resource-id/file.ext
gsutil cp logo.png gs://my-app-bucket/assets/images/
gsutil cp backup.sql gs://my-app-bucket/backups/databases/
gsutil cp report.pdf gs://my-app-bucket/documents/reports/

# Enable object versioning
gsutil versioning set on gs://my-app-bucket
# List all versions of a file
gsutil ls -a gs://my-app-bucket/index.html
# Restore previous version
gsutil cp gs://my-app-bucket/index.html#1234567890 gs://my-app-bucket/index.html

# Configure lifecycle policy to delete files after 30 days
# Write the policy to a file, then apply it
cat > lifecycle.json << 'EOF'
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 30}
      }
    ]
  }
}
EOF
gsutil lifecycle set lifecycle.json gs://my-app-bucket

Storage costs vary by storage class, the amount of data stored, network egress, and the number of operations performed.
Tips: pick the cheapest class that matches your real access pattern, let lifecycle rules transition or delete old data automatically, and set long Cache-Control max-age values on assets that rarely change.
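For a rough feel for the tradeoff between classes, here is a back-of-the-envelope estimator. The per-GB prices are illustrative placeholders only, not current GCP pricing (real prices vary by region and change over time; check the pricing page):

```python
# Illustrative per-GB-per-month prices (NOT current GCP pricing)
PRICE_PER_GB = {
    "STANDARD": 0.020,
    "NEARLINE": 0.010,
    "COLDLINE": 0.004,
    "ARCHIVE":  0.0012,
}

def monthly_storage_cost(gb: float, storage_class: str) -> float:
    """At-rest storage cost only; ignores egress, operations, and retrieval fees."""
    return gb * PRICE_PER_GB[storage_class]

for cls in PRICE_PER_GB:
    print(f"{cls:>8}: ${monthly_storage_cost(1000, cls):.2f}/month for 1 TB")
```

Remember that colder classes trade cheaper storage for retrieval fees and minimum storage durations, so the cheapest at-rest price is not always the cheapest total.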
Explore Firebase Hosting for easier website deployment with custom domains, or learn advanced storage patterns with versioning and data transfers.