---
title: World Geography
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
  - inference-api
license: apache-2.0
short_description: Learning Country Names, locations, flags, principal location
---

# World Geography Game 🌍

An interactive geography game where you try to guess the country the app is thinking of within 20 yes/no questions. Built with [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).

## Architecture

The application follows a modular architecture with clear separation between the UI, game logic, and AI inference:

![Architecture diagram showing the modular design of the World Geography Game with Application Layer, Data Layer, External Services, and User Interface Layer](https://huggingface.co/spaces/jbejar86/World-Geography/resolve/main/arch.png)

### Key Components

- **Gradio ChatInterface**: Provides the web-based chat UI
- **Game Logic**: Manages the 20-questions game flow and state
- **Country Selector**: Randomly selects countries and fetches facts
- **Response Cleaner**: Processes and formats AI model responses
- **Dual Mode Support**: Seamlessly switches between local and cloud inference
- **Facts Fetcher**: Enriches game data with real country information

## Features

- 🎯 20 Questions gameplay format
- 🌍 Covers countries from around the world
- 🤖 AI-powered responses using Llama models
- 🏠 Local model support for development
- ☁️ Cloud deployment with Hugging Face OAuth

## Running Locally

To run this application with a local inference server (such as LM Studio or Ollama):

1. **Create a `.env` file** from the sample:

   ```bash
   cp sample.env .env
   ```

2. **Configure your local model settings** in `.env`:

   ```env
   MODEL_NAME=llama-3.2-3b-instruct
   BASE_URL=http://127.0.0.1:1234/v1
   TOKEN=abc123
   ```

3. **Install dependencies**:

   ```bash
   pip install -r requirements.txt
   ```

4. **Run the application**:

   ```bash
   python app.py
   ```

When running locally, the app automatically detects these environment variables and uses your local model instead of requiring a Hugging Face OAuth login.

## Cloud Deployment

When deployed to Hugging Face Spaces, or when the local environment variables are not set, the app uses the Hugging Face Inference API and requires users to log in with their Hugging Face account.

## How to Play

1. Think of questions that can be answered with "Yes" or "No"
2. Try to narrow down the location, language, geography, or other characteristics
3. You have 20 questions to guess the correct country
4. The AI keeps track of your question count and lets you know when you've won or used all your questions

Good luck! 🎮
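
## Appendix: Dual-Mode Detection Sketch

For reference, the local-vs-cloud switch described under "Running Locally" and "Cloud Deployment" can be sketched roughly as below. This is an illustrative sketch, not the app's actual code: the function name `resolve_inference_config` and the cloud model ID are assumptions; the real `app.py` may structure this differently.

```python
import os


def resolve_inference_config() -> dict:
    """Pick local or cloud inference based on environment variables.

    If MODEL_NAME, BASE_URL, and TOKEN are all set (e.g. pointing at an
    LM Studio or Ollama OpenAI-compatible server), run in local mode.
    Otherwise fall back to cloud mode, where the Gradio app relies on the
    user's Hugging Face OAuth token for the Inference API.
    """
    model = os.environ.get("MODEL_NAME")
    base_url = os.environ.get("BASE_URL")
    token = os.environ.get("TOKEN")

    if model and base_url and token:
        # Local mode: talk to the local inference server directly.
        return {"mode": "local", "model": model,
                "base_url": base_url, "token": token}

    # Cloud mode: model ID is a placeholder for whichever Llama model the
    # Space actually uses; the user's OAuth token is supplied at request time.
    return {"mode": "cloud", "model": "meta-llama/Llama-3.2-3B-Instruct"}
```

In local mode the returned `base_url` and `token` would be handed to an OpenAI-compatible client, while cloud mode would route through `huggingface_hub` with the logged-in user's credentials.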