localgcp
Run Google Cloud locally. One Go binary, fourteen services, zero cloud bills. Nine native services start in under 100ms. Five more via Docker, on demand.
Fourteen services. One command.
Your GCP client libraries already support emulator host env vars. Point them at localhost and your existing code works with zero changes.
Vertex AI: generateContent and embeddings via Ollama. Proxy to Gemma, Llama, or any local model. Stub mode for CI/CD.
Cloud Storage: bucket and object CRUD. Simple, multipart, and resumable uploads. Signed URLs. JSON and XML API paths.
Pub/Sub: topics, subscriptions, publish, pull, StreamingPull. Push subscriptions and dead-letter topics.
Firestore: document CRUD, real-time listeners (onSnapshot), queries with in/array-contains. Transactions.
Secret Manager: secrets with versioning. Enable, disable, destroy states. Access by version number or "latest" alias.
Cloud Tasks: queue and task CRUD. HTTP target dispatch. Scheduling, retry with exponential backoff.
Cloud KMS: encrypt/decrypt, asymmetric sign, HMAC. KeyRing and CryptoKey management. In-memory keys.
Cloud Logging: write and query log entries. Filter by severity, text payload, log name. Bounded in-memory store.
Cloud Run: service CRUD with immediate operations. Auto-generated URIs. No polling required.
Spanner: official Google emulator via Docker. Lazy start on first connection. Full API.
Bigtable: official Google emulator via Docker. Lazy start on first connection. Full API.
Cloud SQL: PostgreSQL 16 via Docker. Standard connection string. Works with any Postgres client.
Memorystore: Redis 7 via Docker. Standard connection string. Works with any Redis client.
BigQuery: DuckDB-powered emulator. Same REST API, same SQL. Works with the bq CLI, Python SDK, and Go SDK.
IAM, Datastore, and more are on the roadmap.
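The tasks emulator above advertises retry with exponential backoff. A minimal sketch of how such a schedule is typically computed; the parameter names mirror Cloud Tasks' RetryConfig, but the default values here are illustrative, and this simplified version only doubles up to a cap (the real RetryConfig also has maxDoublings, after which growth turns linear):

```python
def backoff_schedule(min_backoff=0.1, max_backoff=5.0, attempts=8):
    """Seconds to wait before each retry: exponential growth, capped.

    Simplified model of a Cloud Tasks-style RetryConfig; not
    localgcp's actual defaults.
    """
    return [min(min_backoff * 2 ** i, max_backoff) for i in range(attempts)]

print(backoff_schedule())
# → [0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 5.0, 5.0]
```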
Three commands to local GCP
Install
Single binary. No JVM, no runtime dependencies, and no Docker for the nine native services.
$ brew install slokam-ai/tap/localgcp
# or: go install github.com/slokam-ai/localgcp/cmd/localgcp@latest
Start
All native services in the foreground. For Vertex AI with local models, start Ollama first.
$ ollama pull gemma3 # optional: for Vertex AI
$ localgcp up --vertex-model-map="gemini-2.5-flash=gemma3"
Connect
Sets emulator host env vars. Your GCP client libraries do the rest.
$ eval $(localgcp env)
$ go run ./your-app
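Conceptually, eval $(localgcp env) exports variables like PUBSUB_EMULATOR_HOST, and each client library prefers that value over the production endpoint. A minimal sketch of that resolution logic (the port 8085 below is illustrative, not a documented localgcp default):

```python
import os

def resolve_endpoint(env_var: str, production: str) -> str:
    """Mimic, in simplified form, how Google client libraries choose a
    host: the emulator env var wins when set, else production."""
    return os.environ.get(env_var) or production

# Simulate `eval $(localgcp env)` having exported the variable.
os.environ["PUBSUB_EMULATOR_HOST"] = "localhost:8085"  # assumed port
print(resolve_endpoint("PUBSUB_EMULATOR_HOST", "pubsub.googleapis.com:443"))
# → localhost:8085

# Without the variable, the same code talks to production.
del os.environ["PUBSUB_EMULATOR_HOST"]
print(resolve_endpoint("PUBSUB_EMULATOR_HOST", "pubsub.googleapis.com:443"))
# → pubsub.googleapis.com:443
```

This is why no application changes are needed: the switch between localhost and production lives entirely in the environment.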
Before and after
Without localgcp
- Spin up real GCP dev projects
- Pay cloud bills for test environments
- Burn Vertex AI credits on every prompt test
- Manage individual emulators per service
- No Cloud Storage or Cloud Tasks emulator from Google
- Leak API keys in CI/CD for AI tests
- Depend on an internet connection for every test run
With localgcp
- One binary, starts in milliseconds
- Zero cloud bills for dev and CI
- Run Vertex AI against local Gemma/Llama
- All services in one process
- Every service included, even the ones Google never shipped
- No API keys needed, ever
- Works fully offline
Frequently asked questions
Short answers to the questions developers ask when they search for a way to run Google Cloud locally.
Is there a LocalStack for Google Cloud?
Yes. localgcp is the LocalStack equivalent for Google Cloud Platform. It is a single open-source Go binary that emulates fourteen GCP services on localhost — Vertex AI, BigQuery, Spanner, Firestore, Pub/Sub, Cloud Storage, Bigtable, Cloud SQL, Memorystore, Cloud Tasks, Cloud KMS, Secret Manager, Cloud Run, and Cloud Logging. Your existing GCP client libraries work unchanged — point them at localhost via the standard emulator host environment variables.
How do I run Google Cloud locally?
Install with brew install slokam-ai/tap/localgcp, start the emulator with localgcp up, then run eval $(localgcp env) to export the standard GCP emulator host variables. You can also install from source with go install github.com/slokam-ai/localgcp/cmd/localgcp@latest or run the Docker image at ghcr.io/slokam-ai/localgcp.
Does Google Cloud provide a local emulator?
Google ships fragmented emulators for a subset of services (Pub/Sub, Firestore, Spanner, Bigtable, Datastore) but has no unified local environment, no Cloud Storage emulator, no Vertex AI emulator, and no Cloud Tasks emulator. localgcp fills this gap with one binary that covers fourteen services — including the ones Google does not.
Can I run Vertex AI locally without an API key?
Yes. The Vertex AI emulator in localgcp proxies generateContent and embeddings calls to Ollama-hosted local models (Gemma, Llama, or any model you pull). The official google.golang.org/genai client works unchanged; just set HTTPOptions.BaseURL to http://localhost:8090. Without Ollama running, localgcp returns deterministic stub responses suitable for CI/CD.
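Under the hood, a client pointed at the emulator sends the standard Vertex REST call to localhost:8090. A sketch that only constructs that request, to show what "works unchanged" means; the project and location are placeholder values, and the path shape follows the public Vertex generateContent API rather than anything localgcp-specific:

```python
import json

BASE = "http://localhost:8090"  # localgcp Vertex AI emulator port (from above)

def generate_content_request(model: str, prompt: str,
                             project: str = "demo-project",
                             location: str = "us-central1"):
    """Build the URL and JSON body a Vertex AI client pointed at
    localgcp would send for a generateContent call."""
    url = (f"{BASE}/v1/projects/{project}/locations/{location}"
           f"/publishers/google/models/{model}:generateContent")
    body = {"contents": [{"role": "user", "parts": [{"text": prompt}]}]}
    return url, json.dumps(body)

url, body = generate_content_request("gemini-2.5-flash", "Hello, local model")
print(url)
```

With the model map from the quickstart, a request for gemini-2.5-flash is served by the local gemma3 model.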
Is there a BigQuery emulator?
Yes. localgcp bundles a DuckDB-powered BigQuery emulator (LocalBQ) that exposes the BigQuery REST API on port 9060. It works with the bq CLI, the Python SDK, and the Go SDK. It runs via Docker orchestration to isolate native DuckDB dependencies from the main binary.
How does localgcp compare to LocalStack?
LocalStack emulates AWS. localgcp emulates GCP. Both let you develop and test cloud applications without hitting real cloud APIs or paying cloud bills. localgcp is MIT licensed, ships as a single Go binary (no Python, no Docker required for native services), and starts in under 100ms.
Is localgcp free and open source?
Yes. MIT licensed, fully open source at github.com/slokam-ai/localgcp. No paid tier, no cloud lock-in, no telemetry.
Which languages and SDKs work with localgcp?
Any official Google Cloud client library that honors the standard emulator host environment variables — Go, Python, Java, Node.js, Ruby, .NET, PHP. For services without an env var convention (Secret Manager, KMS, Cloud Run, Vertex AI, Cloud Tasks, Cloud Logging), set the endpoint manually on the client.
Start building locally
Fourteen services. 170+ tests. MIT licensed. Works with Go, Python, Java, Node.js.
View on GitHub · read the docs · download a binary