This repository contains a complete e-commerce microservices backend built from scratch in Python. It demonstrates a modern, decoupled architecture using gRPC for internal communication, Kafka for event-driven data flow, and a GraphQL gateway as the public-facing API.
The entire backend infrastructure is operational. The project is now ready for a frontend application (React, Vue, Mobile) to be connected.
- Core Infrastructure: Docker Compose setup with PostgreSQL (x4), Kafka, Zookeeper, and Elasticsearch.
- Account Service: gRPC service for secure user registration and login (JWT).
- Product Service: FastAPI REST service for product management, Elasticsearch indexing, and Kafka event publishing.
- Recommender Service: Hybrid service that consumes Kafka events to build a local dataset and serves recommendations via gRPC.
- Order Service: gRPC service for order management; acts as an HTTP client to fetch product prices and a Kafka producer for order events.
- Payment Service: Complex hybrid service handling transactions, consuming product events, and processing webhooks to update order status via gRPC.
- GraphQL Gateway: A unified API Gateway (Strawberry) that federates requests to all backend microservices.
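The Recommender's consume-and-build-a-dataset flow can be sketched in miniature. The event schema, topic name, and handler below are illustrative assumptions; the real service wires the same logic to a `kafka-python` consumer.

```python
import json

# Hypothetical sketch of the Recommender consumer: each Kafka message
# carries a product-view event, and the consumer folds it into a local
# per-user interaction dataset. Field names are illustrative assumptions.

def apply_event(dataset: dict, raw_message: bytes) -> dict:
    """Fold one serialized event into the per-user interaction counts."""
    event = json.loads(raw_message)
    user_views = dataset.setdefault(event["user_id"], {})
    user_views[event["product_id"]] = user_views.get(event["product_id"], 0) + 1
    return dataset

# In the real service this loop would iterate a kafka-python
# KafkaConsumer("product_events", ...) instead of a local list.
messages = [
    json.dumps({"user_id": "1", "product_id": "p-42"}).encode(),
    json.dumps({"user_id": "1", "product_id": "p-42"}).encode(),
]
dataset = {}
for msg in messages:
    apply_event(dataset, msg)

print(dataset)  # {'1': {'p-42': 2}}
```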
- Frontend UI: Build a React/Next.js dashboard to visualize products and orders.
- Authentication UI: Login/Register forms consuming the GraphQL mutations.
- Real-time Updates: Use GraphQL Subscriptions for real-time order status updates.
The system uses a Gateway Pattern. Clients communicate only with the GraphQL Gateway, which routes requests to backend services using gRPC or HTTP. Asynchronous tasks (like recommendations or data syncing) are handled via Kafka events.
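The gateway pattern can be sketched in miniature: the gateway owns the routing table and forwards each public request to the backend that owns the field. The stub functions below stand in for the real gRPC/HTTP clients; names and payloads are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch of the gateway pattern: one public surface, with each
# request forwarded to the owning backend. The stubs below stand in for
# real gRPC/HTTP clients; names and payloads are assumptions.

def account_backend(request: dict) -> dict:   # stands in for a gRPC AccountStub
    return {"user_id": request["user_id"], "name": "Test User"}

def product_backend(request: dict) -> dict:   # stands in for an HTTP call to :8002
    return {"product_id": request["product_id"], "price": 9.99}

ROUTES = {
    "user": account_backend,
    "product": product_backend,
}

def gateway(field: str, request: dict) -> dict:
    """Route a public query to the backend service that owns the field."""
    try:
        backend = ROUTES[field]
    except KeyError:
        raise ValueError(f"no backend registered for field {field!r}")
    return backend(request)

print(gateway("user", {"user_id": 1}))  # {'user_id': 1, 'name': 'Test User'}
```

The point of the pattern is that clients never learn backend addresses or protocols; swapping a backend's transport (HTTP to gRPC, say) only touches the routing layer.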
| Service | Type | Port | Database | Responsibility |
|---|---|---|---|---|
| Gateway | GraphQL | 8080 | - | Unified entry point; aggregates data from all services. |
| Account | gRPC | 50051 | Postgres | User auth & management. |
| Product | HTTP | 8002 | Elastic | Product catalog & search. |
| Recommender | gRPC | 50052 | Postgres | Product recommendations (ML). |
| Order | gRPC | 50053 | Postgres | Order processing & history. |
| Payment | gRPC/HTTP | 50054 / 8003 | Postgres | Payments & webhook processing. |
- Language: Python 3.10+
- Frameworks:
- FastAPI: High-performance web framework for REST and GraphQL.
  - gRPC (`grpcio`): Internal service-to-service communication.
  - Strawberry: GraphQL library for Python.
- Data:
- PostgreSQL: Relational data (SQLAlchemy ORM).
- Elasticsearch: Full-text search engine.
- Messaging:
  - Apache Kafka: Event streaming platform (`kafka-python`).
- Infrastructure:
- Docker & Docker Compose: Container orchestration.
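Events on the bus are just serialized bytes, so the interesting part is the payload contract. Here is a hedged sketch of how the Product service might serialize a product event before handing it to a `kafka-python` producer; the topic name and event schema are illustrative assumptions, not the service's actual contract.

```python
import json

# Hedged sketch: serializing a product event for Kafka. The topic name
# and the event schema below are illustrative assumptions.

TOPIC = "product_events"  # assumed topic name

def encode_product_event(product_id: str, name: str, price: float) -> bytes:
    """Serialize a product-created event to the bytes Kafka expects."""
    payload = {
        "type": "product_created",
        "id": product_id,
        "name": name,
        "price": price,
    }
    return json.dumps(payload).encode("utf-8")

event = encode_product_event("p-1", "Keyboard", 49.0)

# With the broker running, the real service would do roughly:
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send(TOPIC, event)
print(json.loads(event)["type"])  # product_created
```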
To simplify development, a Makefile is included to automate the setup, generation of gRPC code, and running of services.
- Python 3.10+
- Docker & Docker Compose
- pip
| Command | Description |
|---|---|
| `make infra` | Starts the Docker infrastructure (Postgres, Kafka, Elasticsearch, Zookeeper). |
| `make install` | Iterates through all service folders, creates virtual environments, and installs dependencies from `requirements.txt`. |
| `make protos` | Compiles `.proto` files into Python gRPC code for all services. |
| `make db` | Runs the database initialization scripts for the Account, Order, Payment, and Recommender services. |
| `make run` | Uses `honcho` to run all 8 services (servers + consumers) plus the frontend in a single terminal window. |
If you are setting this up on a fresh machine, run these commands in order:
1. Start the infrastructure:

```bash
make infra
```

2. Set up the project (wait a moment for the databases to initialize):

```bash
make install
make protos
make db
```

3. Run everything:

```bash
make run
```

All backend logs will appear in this terminal with different colors. Press `Ctrl+C` to stop everything.
Run the following to start all databases (PostgreSQL, Elasticsearch) and the message broker (Kafka/Zookeeper).
```bash
docker-compose up -d
```
### 2. Setup Services
Open a separate terminal for each service. For the first run, install dependencies and set up the database in each folder:
```bash
# Example for one service (repeat for account, product, etc.)
cd service_name
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
# Run DB setup script (if applicable)
python database.py
```
You will need 8 terminals to run the full stack (servers + consumers):

- Account Service: `cd account && python main.py`
- Product Service: `cd product && uvicorn main:app --port 8002 --reload`
- Recommender Consumer: `cd recommender && python app/entry/sync.py`
- Recommender Server: `cd recommender && python app/entry/main.py`
- Order Service: `cd order && python main.py`
- Payment Consumer: `cd payment && python consumer.py`
- Payment Server: `cd payment && python main.py`
- GraphQL Gateway: `cd graphql && uvicorn main:app --port 8080 --reload`
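Before opening the Playground, it can help to confirm that each service is actually listening. This is a small stdlib sketch using the port numbers from the service table above; a `True` only means something accepted a TCP connection, not that the service is healthy.

```python
import socket

# Quick reachability check for the stack's ports (from the service table).
# Pure stdlib; "up" only means a TCP connection was accepted.
PORTS = {
    "gateway": 8080,
    "account": 50051,
    "recommender": 50052,
    "order": 50053,
    "payment-grpc": 50054,
    "product": 8002,
    "payment-http": 8003,
}

def port_open(port: int, host: str = "127.0.0.1", timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in PORTS.items():
    print(f"{name:>13} :{port} -> {'up' if port_open(port) else 'DOWN'}")
```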
Once everything is running, open your browser to: 👉 http://localhost:8080/graphql
You can use the interactive Playground to test the entire flow:
1. Create a User

```graphql
mutation {
  register(name: "Test User", email: "[email protected]", password: "password123") {
    token
  }
}
```

2. Get Recommendations

```graphql
query {
  recommendations(userId: "1") {
    id
    name
    price
  }
}
```

3. Create an Order

```graphql
mutation {
  createOrder(
    accountId: 1
    products: [{ productId: "YOUR_PRODUCT_ID", quantity: 1 }]
  ) {
    id
    totalPrice
    status
  }
}
```

4. Pay for Order

```graphql
mutation {
  checkout(
    userId: 1
    orderId: 1
    email: "[email protected]"
    products: [{ productId: "YOUR_PRODUCT_ID", quantity: 1 }]
  ) {
    checkoutUrl
  }
}
```
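Each of these operations is just a JSON `{"query": ...}` payload POSTed to the endpoint, so the flow can also be scripted without a GraphQL client. A hedged stdlib sketch (the `data.register.token` response path assumes the usual GraphQL response envelope):

```python
import json
import urllib.request

# GraphQL over HTTP: wrap the query string in a {"query": ...} payload
# and POST it to the gateway endpoint.
GATEWAY_URL = "http://localhost:8080/graphql"

def graphql_request(query: str) -> urllib.request.Request:
    """Build the POST request for a GraphQL operation."""
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

REGISTER = """
mutation {
  register(name: "Test User", email: "[email protected]", password: "password123") {
    token
  }
}
"""

req = graphql_request(REGISTER)
# With the stack running, the next lines would execute the mutation:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["data"]["register"]["token"])
print(req.get_method(), req.full_url)  # POST http://localhost:8080/graphql
```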