Blog


    Running Ollama on Google Colab Through Pinggy


    Ollama Google Colab Pinggy AI Deployment LLM Hosting OpenWebUI Python SDK
    Running large language models locally can be expensive and resource-intensive. If you’re tired of paying premium prices for GPU access or dealing with complex local setups, there’s a better way. Google Colab provides free GPU resources, and when combined with Pinggy’s tunneling service, you can run Ollama models accessible from anywhere on the internet. This comprehensive guide will show you exactly how to set up Ollama on Google Colab and use Pinggy’s Python SDK to create secure tunnels that make your models accessible through public URLs.

    Forward Ollama Port 11434 for Online Access: Complete Guide


    Ollama port forwarding Tunneling AI API Remote Access LLM Hosting
    Running AI models locally with Ollama gives you complete control over your data and inference, but what happens when you need to access these models remotely? Whether you’re working from different locations, collaborating with team members, or integrating AI into web applications, forwarding Ollama’s default port 11434 is the key to unlocking remote access to your local AI models. This comprehensive guide will show you exactly how to forward Ollama’s port 11434 to make your local AI models accessible online using secure tunneling.
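Before forwarding anything, it helps to confirm that Ollama is actually listening on its default port. A minimal Python sketch (assuming Ollama’s default binding of 127.0.0.1:11434) might look like this:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ollama listens on 127.0.0.1:11434 by default; check before tunneling.
if port_open("127.0.0.1", 11434):
    print("Ollama is up -- ready to forward port 11434")
else:
    print("Nothing listening on 11434 -- start Ollama first")
```

Once the local check passes, the tunneling tool of your choice can expose that same port to a public URL.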

    Self-hosting Obsidian


    obsidian self-hosted Docker couchdb livesync Pinggy
    I’ve been using Obsidian as my main note-taking tool for over two years, but didn’t want to pay $5/month for Obsidian Sync when I could build something better. After some research, I found the perfect setup: Docker for containerization, CouchDB for real-time sync, and Pinggy for secure remote access. It costs almost nothing, gives me full control of my data, and works flawlessly across all devices. The best part is the Obsidian LiveSync plugin, which provides faster, more reliable sync than the official service.

    What is 127.0.0.1 and Loopback?


    networking localhost 127.0.0.1 loopback development
    If you’ve ever typed localhost in your browser or seen 127.0.0.1 in configuration files, you’ve encountered one of networking’s most fundamental concepts: the loopback address. This special IP address is your computer’s way of talking to itself, and understanding it is crucial for anyone doing development work. The address 127.0.0.1 is the standard IPv4 loopback address that always points to your own computer. It’s the IP address behind “localhost” and enables local network communication without ever leaving your machine.
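The idea is easy to see in a few lines of Python: a server bound to 127.0.0.1 and a client on the same machine can exchange data without a single packet leaving the host. This is just an illustrative sketch, not tied to any particular post above:

```python
import socket
import threading

def echo_once(server: socket.socket) -> None:
    """Accept one connection and echo back whatever it sends."""
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Bind to the loopback address; traffic never leaves this machine.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=echo_once, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello loopback")
    reply = client.recv(1024)

server.close()
print(reply)  # b'hello loopback'
```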

    How to Self-Host Any LLM – Step by Step Guide


    Self-Hosted AI Ollama Open WebUI Docker LLM Deployment AI Privacy
    Self-hosting large language models has become increasingly popular as developers and organizations seek greater control over their AI infrastructure. Running models like Llama 3, Mistral, or Gemma on your own hardware gives you complete privacy, eliminates API costs, and lets you customize everything to your exact needs. The best part is that modern tools make this process surprisingly straightforward, even if you’re not a DevOps expert. This comprehensive guide will walk you through setting up your own LLM hosting environment using Ollama and Open WebUI with Docker.

    USA, Europe, or China - Who has the best AI Models?


    LLM comparison AI models 2025 GPT-5 Claude 4 Gemini 2.5 Qwen3 DeepSeek AI benchmark global AI race
    The AI world in 2025 looks completely different from just two years ago. What started as an American-dominated field has evolved into a genuine three-way competition between the United States, China, and Europe. Each region has developed its own approach to AI, and honestly, it’s made the whole space way more interesting. The US still leads in breakthrough research and commercial applications, but China has been moving fast with cost-effective models that perform surprisingly well.

    Best Free & Open-Source AI Image Generators to Self-Host


    AI image generation self-hosted open-source Stable Diffusion FLUX.1 machine learning
    Tired of paying monthly subscriptions for AI image generation or dealing with usage limits on cloud-based services? Self-hosting your own AI image generator might be exactly what you need. The open-source community has delivered some incredible tools that can run on your own hardware, giving you complete control over your creative workflow without the recurring costs. Whether you’re a developer building the next great creative app, an artist looking for unlimited creative freedom, or just someone who wants to experiment without monthly fees, these open-source tools deliver professional-quality results.

    Best AI Tools for Coding in 2025


    AI coding tools GitHub Copilot Cursor development programming
    The landscape of AI-powered coding tools has evolved dramatically in 2025, transforming how developers write, debug, and maintain code. From intelligent autocomplete suggestions to full codebase analysis and refactoring, AI coding assistants have become indispensable tools for modern software development. Whether you’re a solo developer looking to boost productivity or part of a team seeking to streamline your development workflow, choosing the right AI coding tool can significantly impact your coding efficiency and code quality.

    What port does 'ping' work on?


    networking ICMP ping protocols troubleshooting
    ping doesn’t use a TCP or UDP port at all. Zero. None. Nada. That’s because ping doesn’t even use TCP or UDP in the first place - it works on a completely different protocol layer: ICMP (Internet Control Message Protocol). You could sit there with Wireshark running and watch packets fly by all day, but you won’t find a dst port=80 or dst port=443 in a real ICMP ping. If you’re used to troubleshooting with tools like telnet or nc that always need a port number, this can feel super weird.
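You can see this in the IP protocol numbers themselves: ICMP sits beside TCP and UDP at the IP layer rather than on top of them, and only TCP and UDP headers contain port fields. Python’s socket module exposes the standard numbers as constants:

```python
import socket

# IP carries different payload protocols, each with its own number.
# Ports exist only inside TCP and UDP headers; ICMP has none.
print("ICMP =", socket.IPPROTO_ICMP)  # 1
print("TCP  =", socket.IPPROTO_TCP)   # 6
print("UDP  =", socket.IPPROTO_UDP)   # 17
```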

    UDP vs TCP: Complete Guide to Network Protocols in 2025


    networking protocols tcp udp guide
    When building network applications, developers face a fundamental choice between two core internet protocols: TCP and UDP. This decision can make or break your application’s performance, affecting everything from user experience to system reliability. Understanding the differences between UDP and TCP isn’t just academic knowledge - it’s practical wisdom that determines whether your online game feels responsive, your video call stays smooth, or your financial transaction completes successfully.
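The contrast shows up immediately in the sockets API: TCP (SOCK_STREAM) requires a connection before any data moves, while UDP (SOCK_DGRAM) simply fires datagrams with no handshake. A small sketch over loopback (where UDP delivery is reliable in practice, unlike on a real network):

```python
import socket

# TCP: connection-oriented, ordered byte stream.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# UDP: connectionless datagrams -- no handshake, no retransmission,
# no ordering guarantees.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))       # OS assigns a free port

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"ping", receiver.getsockname())

data, addr = receiver.recvfrom(1024)
print(data)  # b'ping'

tcp.close()
sender.close()
receiver.close()
```

The TCP socket above never sends anything because it was never connected; that asymmetry is exactly the trade-off the guide explores.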