This repository contains the configuration for a server environment that integrates JupyterHub, NGINX, and Ollama using Docker.
The Lab setup provides a flexible and containerized environment for running multi-user Jupyter notebooks with secure proxying via NGINX and AI model serving through Ollama.
- JupyterHub – Manages multiple Jupyter notebook servers for users.
- NGINX – Acts as a reverse proxy for handling HTTPS, load balancing, and routing.
- Ollama – Provides access to locally hosted LLMs for AI-powered notebooks.
The repository is laid out as follows:

```
lab/
├── docker-compose.yml         # Main Docker Compose configuration
├── jupyterhub/
│   └── jupyterhub_config.py
├── nginx/
│   └── default.conf
└── Dockerfile
```
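For orientation, here is a minimal sketch of what `docker-compose.yml` could look like for this stack. The service names, the top-level `Dockerfile` build for JupyterHub, and the exact volume paths are assumptions inferred from the layout above, not the repository's actual contents:

```yaml
# Hypothetical sketch, not the repo's actual docker-compose.yml.
services:
  jupyterhub:
    build: .                      # assumes the top-level Dockerfile builds the hub image
    volumes:
      - ./jupyterhub/jupyterhub_config.py:/srv/jupyterhub/jupyterhub_config.py
    ports:
      - "8000:8000"
  nginx:
    image: nginx:latest
    volumes:
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf
    ports:
      - "80:80"
    depends_on:
      - jupyterhub
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"
```

Bind-mounting the two config files means edits on the host take effect after a `docker compose restart`, with no image rebuild.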
To run the stack you need:

- Docker and Docker Compose installed
- Adequate hardware for Ollama model inference (GPU recommended)
To get everything running:

- Clone the repository:

  ```bash
  git clone https://github.com/<your-username>/Lab.git
  cd Lab
  ```
- Start all services:

  ```bash
  docker compose up -d
  ```
- Access the services:
  - JupyterHub: http://localhost:8000
  - NGINX (Proxy): http://localhost
  - Ollama API: http://localhost:11434
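To confirm that Ollama is reachable, you can hit its HTTP API directly. The `ollama` service name and the `llama3` model below are examples; any model you have pulled will work:

```bash
# List the models Ollama currently has (empty until one is pulled)
curl http://localhost:11434/api/tags

# Pull a model inside the container, then run a quick generation
docker compose exec ollama ollama pull llama3
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello", "stream": false}'
```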
Stop all services with:

```bash
docker compose down
```

Customization:

- Modify `nginx/default.conf` for custom routes or SSL setup (a sketch follows below).
- Update `jupyterhub_config.py` to change authentication or spawner settings (see the example after this list).
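As a starting point, a reverse-proxy `default.conf` might look like the following. The upstream host `jupyterhub` and port 8000 assume the compose layout sketched earlier; this is illustrative, not the file shipped in the repo:

```nginx
# Hypothetical sketch of nginx/default.conf; adapt names and ports.
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://jupyterhub:8000;  # compose service name assumed
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # WebSocket headers, required for Jupyter kernels and terminals
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
    }
}
```

For SSL, add a second `server` block listening on 443 with `ssl_certificate` and `ssl_certificate_key` directives pointing at your certificates.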
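Similarly, here is a minimal excerpt showing the kinds of settings `jupyterhub_config.py` controls. The authenticator and spawner classes are illustrative choices, and `DockerSpawner` requires the separate `dockerspawner` package:

```python
# Hypothetical excerpt; not the repo's actual jupyterhub_config.py.
c = get_config()  # noqa: F821 -- injected by JupyterHub at load time

# Bind on all interfaces so the NGINX container can reach the hub.
c.JupyterHub.bind_url = "http://0.0.0.0:8000"

# Authentication: DummyAuthenticator is for local testing only.
c.JupyterHub.authenticator_class = "jupyterhub.auth.DummyAuthenticator"
c.DummyAuthenticator.password = "change-me"  # never use in production

# Spawner: launch each user's notebook server in its own container.
c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
c.DockerSpawner.image = "jupyter/base-notebook:latest"
```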
This project is licensed under the MIT License.