
Lab

This repository contains the configuration for a server environment that integrates JupyterHub, NGINX, and Ollama using Docker.

Overview

The Lab setup provides a flexible and containerized environment for running multi-user Jupyter notebooks with secure proxying via NGINX and AI model serving through Ollama.

Components

  • JupyterHub – Manages multiple Jupyter notebook servers for users.
  • NGINX – Acts as a reverse proxy for handling HTTPS, load balancing, and routing.
  • Ollama – Provides access to locally hosted LLMs for AI-powered notebooks; see the example call after this list.
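
For instance, a notebook can query a model through Ollama's HTTP API. The following is a minimal sketch, assuming the Ollama container is reachable from the notebook containers under the hostname ollama on its default port 11434 and that a model named llama3 has already been pulled — both depend on the compose configuration:

    # Hypothetical example of calling Ollama from a notebook; hostname, port,
    # and model name depend on your compose setup.
    import requests

    resp = requests.post(
        "http://ollama:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": "Explain Docker networking in one paragraph.",
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])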

Structure


lab/
├── docker-compose.yml     # Main Docker Compose configuration
├── jupyterhub/
│   └── jupyterhub_config.py
├── nginx/
│   └── default.conf
└── Dockerfile
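
A minimal sketch of how docker-compose.yml might wire the three services together. Service names, ports, volumes, and the JupyterHub build are assumptions for illustration; the actual file in this repository may differ:

    # Hypothetical sketch of the three-service layout; not the repository's actual file.
    services:
      jupyterhub:
        build: .                      # built from the Dockerfile in this repo
        volumes:
          - ./jupyterhub/jupyterhub_config.py:/srv/jupyterhub/jupyterhub_config.py:ro
          - /var/run/docker.sock:/var/run/docker.sock   # needed if DockerSpawner is used
      nginx:
        image: nginx:latest
        ports:
          - "443:443"                 # TLS terminated by NGINX
        volumes:
          - ./nginx/default.conf:/etc/nginx/conf.d/default.conf:ro
        depends_on:
          - jupyterhub
      ollama:
        image: ollama/ollama
        volumes:
          - ollama:/root/.ollama      # persist downloaded models
        # Optional GPU access for faster inference:
        # deploy:
        #   resources:
        #     reservations:
        #       devices:
        #         - driver: nvidia
        #           count: all
        #           capabilities: [gpu]

    volumes:
      ollama: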

Getting Started

Prerequisites

  • Docker and Docker Compose installed
  • Adequate hardware for Ollama model inference (GPU recommended)
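
To confirm the prerequisites are in place, the following commands should all succeed (the last one only applies if a GPU is available):

    docker --version
    docker compose version
    nvidia-smi   # optional, only relevant for GPU-backed Ollama inference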

Setup

  1. Clone the repository:

      git clone https://github.com/<your-username>/Lab.git
      cd Lab
  2. Start all services:

      docker compose up -d
  3. Access the services through the NGINX reverse proxy; the hostnames, ports, and TLS settings it exposes are defined in nginx/default.conf.
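
To verify that the containers are up and inspect their output:

    docker compose ps
    docker compose logs -f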

Stopping Services

  docker compose down

Customization

  • Modify nginx/default.conf for custom routes or SSL setup; a sketch of a typical reverse-proxy block follows below.
  • Update jupyterhub_config.py to change authentication or spawner settings; see the second sketch below.
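
As a starting point, a reverse-proxy server block for JupyterHub might look like the following. This is a sketch rather than the repository's actual default.conf: the upstream jupyterhub:8000 assumes the compose service name and JupyterHub's default proxy port, and the hostname and certificate paths are placeholders.

    # Hypothetical sketch; the real nginx/default.conf may differ.
    map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
    }

    server {
        listen 443 ssl;
        server_name lab.example.org;                       # placeholder hostname

        ssl_certificate     /etc/nginx/certs/fullchain.pem;   # placeholder paths
        ssl_certificate_key /etc/nginx/certs/privkey.pem;

        location / {
            proxy_pass http://jupyterhub:8000;             # assumed service name and port
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            # WebSocket support, required for Jupyter kernels
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
        }
    }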
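
On the JupyterHub side, authentication and spawning are controlled from jupyterhub_config.py. A minimal sketch, assuming DockerSpawner and the built-in dummy authenticator are acceptable for local testing; the image and network name below are placeholders, and the repository's real config may use different classes:

    # Hypothetical sketch; not the repository's actual jupyterhub_config.py.
    c = get_config()  # provided by JupyterHub when it loads this file

    # Authentication: swap for OAuthenticator, LDAP, etc. in production.
    c.JupyterHub.authenticator_class = "jupyterhub.auth.DummyAuthenticator"

    # Spawner: run each user's notebook server in its own Docker container.
    c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
    c.DockerSpawner.image = "jupyter/scipy-notebook:latest"   # assumed single-user image
    c.DockerSpawner.network_name = "lab_default"              # assumed compose network name

    # Make the hub reachable from the spawned containers and from NGINX.
    c.JupyterHub.hub_ip = "0.0.0.0"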

License

This project is licensed under the MIT License.
