# Fusero App Boilerplate
A full-stack application boilerplate with React frontend and Node.js backend.
## Project Structure
```
fusero-app-boilerplate/
├── frontend/ # React frontend application
├── backend/ # Node.js backend application
├── docker-compose.yml # Production Docker configuration
└── docker-compose.dev.yml # Development Docker configuration
```
## Prerequisites
- Node.js (v20 or higher)
- npm (v9 or higher)
- Docker and Docker Compose
- Git
## Development Setup
### Important Note: Database Must Run in Docker
The PostgreSQL database must always run in Docker, regardless of your development setup choice. This ensures consistent database behavior across all environments.
To start the database:
```bash
docker-compose up db
```
### Running Services Separately (Recommended for Development)
For a better debugging experience, run the frontend and backend in separate terminal windows while keeping the database in Docker:
1. **First, ensure the database is running in Docker**
```bash
docker-compose up db
```
2. **Then, in separate terminal windows:**
#### Terminal 1: Backend Service
```bash
cd backend
npm install
npm run dev
```
The backend will be available at http://localhost:14000
#### Terminal 2: Frontend Service
```bash
cd frontend
npm install
npm run dev
```
The frontend will be available at http://localhost:3000
### Database Setup
1. **Create a New Volume**
- Ensure the database volume is created:
```bash
docker volume create fusero-db-data
```
2. **Run Migrations**
- Apply database migrations to set up the schema:
```bash
cd backend
npm run migrate
```
3. **Seed the Database**
- Populate the database with initial data:
```bash
cd backend
npm run seed
```
### Environment Setup
1. **Backend Environment**
- Copy `.env.example` to `.env` in the backend directory
- Configure your environment variables:
```
PORT=14000
DB_HOST=localhost
DB_PORT=19090
DB_USER=postgres
DB_PASSWORD=postgres
DB_NAME=fusero
JWT_SECRET=your_jwt_secret_key_here
```
2. **Frontend Environment**
- Copy `.env.example` to `.env` in the frontend directory
- Set the API base URL:
```
VITE_API_BASE_URL=http://localhost:14000/api/v1
```
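For reference, Vite exposes `VITE_`-prefixed variables on `import.meta.env` at build time. Below is a minimal sketch of how the frontend can consume this value; the file path and `fetchJson` helper are illustrative, not necessarily how the boilerplate is organized:
```typescript
// frontend/src/api/client.ts (illustrative path)
// Read the configured API base URL, falling back to the local backend.
const API_BASE_URL: string =
  import.meta.env.VITE_API_BASE_URL ?? 'http://localhost:14000/api/v1';

// Tiny wrapper around fetch that prefixes every request with the base URL.
export async function fetchJson<T>(path: string, init?: RequestInit): Promise<T> {
  const response = await fetch(`${API_BASE_URL}${path}`, init);
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }
  return response.json() as Promise<T>;
}
```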
## Production Deployment
1. **Build and Run with Docker**
```bash
docker-compose up --build
```
2. **Run Migrations and Seeders in Production**
After your containers are up, run the following commands to apply database migrations and seed data inside the backend container:
```bash
docker exec -it fusero-app-backend npx mikro-orm migration:up
docker exec -it fusero-app-backend npm run seed
```
**Note:** These commands must be run inside the backend container so they use the correct Docker network and environment variables.
3. **Environment Variables**
- Ensure all environment variables are properly set in your production environment
- Never commit `.env` files to version control
## Frontend Routing in Production
In production, the frontend is served through nginx. To ensure client-side routing works correctly:
1. **Nginx Configuration**
- Ensure your nginx configuration includes the following directive to handle unknown routes:
```nginx
location / {
    try_files $uri $uri/ /index.html;
}
```
2. **React Router Configuration**
- Set the `basename` dynamically based on the environment (see the sketch after this list):
- In production, set `basename="/dashboard"`.
- In development, set `basename="/"`.
3. **Navigation Links**
- Use relative paths in your navigation links (e.g., `to="canvas/canvas-endpoints"` instead of `to="/dashboard/canvas/canvas-endpoints"`).
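A minimal sketch of the router setup described above, assuming `react-router-dom` and Vite's built-in `import.meta.env.PROD` flag (file and component names are illustrative):
```tsx
// frontend/src/main.tsx (illustrative)
import React from 'react';
import ReactDOM from 'react-dom/client';
import { BrowserRouter } from 'react-router-dom';
import App from './App';

// Production builds are served under /dashboard behind nginx, so the router
// needs a matching basename; development serves the app from the root.
const basename = import.meta.env.PROD ? '/dashboard' : '/';

ReactDOM.createRoot(document.getElementById('root')!).render(
  <BrowserRouter basename={basename}>
    <App />
  </BrowserRouter>
);
```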
## HTTPS with Self-Signed Certificates
To run the application with HTTPS using a self-signed certificate:
1. **Generate a Self-Signed Certificate**
```bash
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout ./nginx/ssl/nginx.key -out ./nginx/ssl/nginx.crt
```
2. **Update Docker Compose**
- Ensure your `docker-compose.yml` mounts the certificate files in the nginx service:
```yaml
volumes:
  - ./nginx/ssl:/etc/nginx/ssl
```
3. **Nginx Configuration**
- Use the production nginx configuration that includes SSL settings.
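For reference, the SSL-related part of such a configuration looks roughly like the sketch below, assuming the certificate is mounted at `/etc/nginx/ssl` as configured above (adjust `server_name` and paths to your environment):
```nginx
server {
    listen 443 ssl;
    server_name localhost;

    ssl_certificate     /etc/nginx/ssl/nginx.crt;
    ssl_certificate_key /etc/nginx/ssl/nginx.key;

    location / {
        try_files $uri $uri/ /index.html;
    }
}
```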
## Development Best Practices
1. **Database Management**
- Always run the database in Docker
- Use `docker-compose.dev.yml` for development
- Never run PostgreSQL directly on your host machine
2. **Running Services Separately**
- For development, it's recommended to run frontend and backend in separate terminal windows
- This allows for better debugging and hot-reloading
- You can see logs from each service clearly
3. **Code Organization**
- Frontend code should be in the `frontend/` directory
- Backend code should be in the `backend/` directory
- Shared types and utilities should be in their respective directories
4. **Version Control**
- Commit `package-lock.json` files
- Don't commit `.env` files
- Use meaningful commit messages
## API Documentation
The backend API is documented using Swagger/OpenAPI. After starting the backend service, you can access the API documentation at:
- Development: http://localhost:14000/api-docs
- Production: http://your-domain/api-docs
## Troubleshooting
1. **Port Conflicts**
- If you encounter port conflicts, check which services are running:
```bash
docker ps
```
- Or check for processes using the ports:
```bash
lsof -i :3000
lsof -i :14000
```
2. **Database Issues**
- Ensure PostgreSQL is running in Docker
- Check database connection settings in `.env`
- Verify database migrations are up to date
- If database issues persist, try:
```bash
docker-compose -f docker-compose.dev.yml down
docker-compose -f docker-compose.dev.yml up db
```
3. **CORS Issues**
- If you see CORS errors, verify the frontend's API base URL
- Check backend CORS configuration
- Ensure both services are running on the correct ports
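If the backend happens to use Express with the `cors` middleware (an assumption; adapt this to whatever the backend actually uses), the configuration to check looks roughly like the sketch below, where `FRONTEND_ORIGIN` is an illustrative variable name:
```typescript
// backend/src/app.ts (illustrative; assumes an Express backend using the cors package)
import express from 'express';
import cors from 'cors';

const app = express();

// Allow the frontend origin; in development this is the Vite dev server.
app.use(
  cors({
    origin: process.env.FRONTEND_ORIGIN ?? 'http://localhost:3000',
    credentials: true,
  })
);

export default app;
```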
## Contributing
1. Create a new branch for your feature
2. Make your changes
3. Submit a pull request
4. Ensure all tests pass
5. Update documentation as needed
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Technical Documentation: ChatGPT-Powered Endpoint Creation
### Overview
Developers can leverage the ChatGPT modal in the Canvas Endpoints UI to create new Canvas API endpoints using natural language prompts. When a user enters a prompt like "Create a course endpoint for Canvas", the system uses ChatGPT to:
1. Interpret the intent and generate a JSON object with the required fields for the endpoint (name, method, path, description, etc.).
2. Automatically submit this JSON to the backend endpoint creation API (`/api/v1/canvas-api/endpoints`).
3. Refresh the endpoint list in the UI and display a success message.
### How it Works
- **Prompt Handling:**
- The frontend sends the user's prompt to `/api/v1/canvas-api/chatgpt/completions`.
- ChatGPT is instructed to return only a JSON object suitable for the endpoint creation form.
- **Auto-Creation:**
- If the response is a valid endpoint JSON (with `name`, `method`, and `path`), the frontend posts it to `/api/v1/canvas-api/endpoints`.
- The endpoint list is refreshed and a toast notification is shown.
- **Fallback:**
- If the response is not a valid endpoint JSON, it is displayed as a normal chat message.
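A simplified sketch of that flow in TypeScript (the function name, request body shape, and `EndpointDraft` type are illustrative; the real logic lives in `frontend/src/components/CanvasEndpoints.tsx`):
```typescript
// Minimum fields the frontend expects ChatGPT to return for auto-creation.
interface EndpointDraft {
  name: string;
  method: string;
  path: string;
  description?: string;
}

async function handlePrompt(prompt: string): Promise<void> {
  // 1. Ask the backend's ChatGPT proxy to turn the prompt into endpoint JSON.
  //    (The exact request body shape here is an assumption.)
  const reply = await fetch('/api/v1/canvas-api/chatgpt/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  }).then((res) => res.json());

  // 2. Try to interpret the reply as an endpoint draft.
  let draft: EndpointDraft | null = null;
  try {
    draft = typeof reply === 'string' ? JSON.parse(reply) : reply;
  } catch {
    draft = null;
  }

  // 3. Valid drafts are posted to the endpoint creation API; anything else
  //    falls back to being shown as a normal chat message.
  if (draft?.name && draft?.method && draft?.path) {
    await fetch('/api/v1/canvas-api/endpoints', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(draft),
    });
    // The real component then refreshes the endpoint list and shows a toast.
  }
}
```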
### Example Prompt
```
Create a course endpoint for Canvas. Use the Canvas API docs to determine the correct path and required fields.
```
### Example ChatGPT Response
```json
{
  "name": "Create Course",
  "method": "POST",
  "path": "/courses",
  "description": "Creates a new course in Canvas."
}
```
### Developer Notes
- The ChatGPT modal logic is in `frontend/src/components/CanvasEndpoints.tsx`.
- The backend endpoint creation API is `/api/v1/canvas-api/endpoints`.
- The system expects ChatGPT to return a JSON object with at least `name`, `method`, and `path`.
- The endpoint list is auto-refreshed after creation.
---