
Sunday, 9 July 2023

The Database Program Supabase

 

The open source Firebase alternative. Follow to stay updated about our public Beta.

Supabase gives you a dedicated Postgres database to build your web, mobile, and AI applications.

supabase.com (https://supabase.com/)

Supabase is an open source Firebase alternative. We're building the features of Firebase using enterprise-grade open source tools.

  • Hosted Postgres Database. Docs
  • Authentication and Authorization. Docs
  • Auto-generated APIs.
  • Functions.
    • Database Functions. Docs
    • Edge Functions. Docs
  • File Storage. Docs
  • AI + Vector/Embeddings Toolkit. Docs
  • Dashboard

Documentation

For full documentation, visit supabase.com/docs

To see how to Contribute, visit Getting Started

Community & Support

  • Community Forum. Best for: help with building, discussion about database best practices.
  • GitHub Issues. Best for: bugs and errors you encounter using Supabase.
  • Email Support. Best for: problems with your database or infrastructure.
  • Discord. Best for: sharing your applications and hanging out with the community.

Status

  • Alpha: We are testing Supabase with a closed set of customers
  • Public Alpha: Anyone can sign up over at supabase.com/dashboard. But go easy on us, there are a few kinks
  • Public Beta: Stable enough for most non-enterprise use-cases
  • Public: General Availability

We are currently in Public Beta. Watch "releases" of this repo to get notified of major updates.



How it works

Supabase is a combination of open source tools. We’re building the features of Firebase using enterprise-grade, open source products. If the tools and communities exist, with an MIT, Apache 2, or equivalent open license, we will use and support that tool. If the tool doesn't exist, we build and open source it ourselves. Supabase is not a 1-to-1 mapping of Firebase. Our aim is to give developers a Firebase-like developer experience using open source tools.

Architecture

Supabase is a hosted platform. You can sign up and start using Supabase without installing anything. You can also self-host and develop locally.
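If you want to develop locally, the Supabase CLI can start the whole stack in Docker. A minimal sketch, assuming the CLI and Docker are already installed:

# scaffold a local project and start the local Supabase stack (Postgres, APIs, Studio)
supabase init
supabase start
# tear the local containers down when you are done
supabase stop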


  • PostgreSQL is an object-relational database system with over 30 years of active development that has earned it a strong reputation for reliability, feature robustness, and performance.
  • Realtime is an Elixir server that allows you to listen to PostgreSQL inserts, updates, and deletes using websockets. Realtime polls Postgres' built-in replication functionality for database changes, converts changes to JSON, then broadcasts the JSON over websockets to authorized clients.
  • PostgREST is a web server that turns your PostgreSQL database directly into a RESTful API (see the curl example after this list).
  • pg_graphql is a PostgreSQL extension that exposes a GraphQL API.
  • Storage provides a RESTful interface for managing files stored in S3, using Postgres to manage permissions.
  • postgres-meta is a RESTful API for managing your Postgres database, allowing you to fetch tables, add roles, run queries, and so on.
  • GoTrue is a JWT-based API for managing users and issuing JWT tokens.
  • Kong is a cloud-native API gateway.
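Because PostgREST sits behind the Kong gateway, every table in your database is reachable over plain HTTP. A minimal sketch with curl, assuming a hypothetical todos table and placeholder project ref and anon key:

# query the auto-generated REST API (PostgREST, routed through Kong)
curl "https://<project-ref>.supabase.co/rest/v1/todos?select=*" \
  -H "apikey: <anon-key>" \
  -H "Authorization: Bearer <anon-key>"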

Client libraries

Our approach for client libraries is modular. Each sub-library is a standalone implementation for a single external system. This is one of the ways we support existing tools.

Each language is listed with its combined Supabase client, followed by the feature clients it bundles (PostgREST, GoTrue, Realtime, Storage, Functions):

⚡️ Official ⚡️
  • JavaScript (TypeScript): supabase-js (postgrest-js, gotrue-js, realtime-js, storage-js, functions-js)
  • Flutter: supabase-flutter (postgrest-dart, gotrue-dart, realtime-dart, storage-dart, functions-dart)

💚 Community 💚
  • C#: supabase-csharp (postgrest-csharp, gotrue-csharp, realtime-csharp, storage-csharp, functions-csharp)
  • Go: no combined client; postgrest-go, gotrue-go, storage-go, functions-go
  • Java: no combined client; gotrue-java, storage-java
  • Kotlin: supabase-kt (postgrest-kt, gotrue-kt, realtime-kt, storage-kt, functions-kt)
  • Python: supabase-py (postgrest-py, gotrue-py, realtime-py, storage-py, functions-py)
  • Ruby: supabase-rb (postgrest-rb)
  • Rust: no combined client; postgrest-rs
  • Swift: supabase-swift (postgrest-swift, gotrue-swift, realtime-swift, storage-swift, functions-swift)
  • Godot Engine (GDScript): supabase-gdscript (postgrest-gdscript, gotrue-gdscript, realtime-gdscript, storage-gdscript, functions-gdscript)

from https://github.com/supabase/supabase

----------

GitHub Action for Supabase Backups

Supa-Backup

Supa-Backup is a GitHub action that creates a backup of your Supabase database and stores it in your repository. With this action, you can easily automate the process of creating backups and ensure that your data is safe and secure. You can also copy our workflow example & deploy it directly to your repo.

Usage

To use Supa-Backup, you'll need to add the action to your GitHub workflow. Here's an example:

    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Supa-Backup
        uses: mansueli/supa-backup@v1.0.5
        with:
          supabase_url: 'postgresql://postgres:<pass>@db.<ref>.supabase.co:5432/postgres'
          file_prefix: 'test_' 

Warning

DO NOT run this on a public repo.


Workflow Example

Here's a workflow example that demonstrates how to use the Supa-Backup action & commit the backup to the repo:

name: Supa-Backup
on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *' # Runs every day at midnight
jobs:
  backup:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Supa-Backup
        uses: mansueli/supa-backup@v1.0.5
        with:
          supabase_url: 'postgresql://postgres:<pass>@db.<ref>.supabase.co:5432/postgres'
          file_prefix: 'test_'
      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: Supabase backup

In this example, the Supa-Backup action is run every day at midnight. It runs on the latest version of Ubuntu and performs three steps: checking out your repository, running the Supa-Backup action, and committing the resulting backup with git-auto-commit-action.

Storage Backup Workflow (commits the backup to the repo)

name: SupaStorage-backup
on:
  workflow_dispatch:
  schedule:
    - cron: '0 */6 * * *'
jobs:
  backup:
    runs-on: ubuntu-latest
    env:
      SUPABASE_URL: https://project_ref.supabase.co
      SUPABASE_SERVICE_ROLE: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6InByb2plY3RfcmVmIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImlhdCI6MTY3MDg4MTY2MSwiZXhwIjoxOTg2NDU3NjYxfQ.fOAQAZMEXOhUh2CDBKKXYrjm_RhB6DlMFVRn8-u_SLA
    permissions:
      contents: write
    steps:
    - name: Checkout code
      uses: actions/checkout@v4
      with:
         ref: ${{ github.head_ref }}
    - name: Set up Python
      uses: actions/setup-python@v4
      with:
        python-version: '3.10'

    - name: Install dependencies
      run: |
        pip install supabase
        [[ -d supabase_storage_backup ]] || mkdir supabase_storage_backup
        cd supabase_storage_backup
        wget https://raw.githubusercontent.com/mansueli/Supa-Migrate/main/storage-backup.py
        chmod +x storage-backup.py
        python storage-backup.py
        rm storage-backup.py
      shell: bash
      
    - name: Set current date as env variable
      run: echo "NOW=$(date +'%Y-%m-%dT%H:%M:%S')" >> $GITHUB_ENV
        
    - uses: stefanzweifel/git-auto-commit-action@v4
      with:
        commit_message: Supabase Storage backup - ${{ env.NOW }}

 

from https://github.com/mansueli/Supa-Backup

------------------------------------------------------------

Automated backups using GitHub Actions

Back up your database

You can use the Supabase CLI to back up your Postgres database. The steps involve running a series of commands to dump the roles, schema, and data separately.
Inside your repository, create a new file inside the .github/workflows folder called backup.yml. Copy the following snippet into the file, and the action will run whenever a new PR is created.

Never back up your data to a public repository.

Backup action

name: 'backup-database'
on:
  pull_request:
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      supabase_db_url: ${{ secrets.SUPABASE_DB_URL }}   # For example: postgresql://postgres:[YOUR-PASSWORD]@db.<ref>.supabase.co:5432/postgres
    steps:
      - uses: actions/checkout@v2
      - uses: supabase/setup-cli@v1
        with:
          version: latest
      - name: Backup roles
        run: supabase db dump --db-url "$supabase_db_url" -f roles.sql --role-only
      - name: Backup schema
        run: supabase db dump --db-url "$supabase_db_url" -f schema.sql
      - name: Backup data
        run: supabase db dump --db-url "$supabase_db_url" -f data.sql --data-only --use-copy
Periodic Backups Workflow

You can use the GitHub Action to run periodic backups of your database. In this example, the Action workflow is triggered by push and pull_request events on the main branch, manually via workflow_dispatch, and automatically at midnight every day due to the schedule event with a cron expression.
The workflow runs on the latest Ubuntu runner and requires write permissions to the repository's contents. It uses the Supabase CLI to dump the roles, schema, and data from your Supabase database, utilizing the SUPABASE_DB_URL environment variable that is securely stored in the GitHub secrets.
After the backup is complete, it auto-commits the changes to the repository using the git-auto-commit-action. This ensures that the latest backup is always available in your repository. The commit message for these automated commits is "Supabase backup".
This workflow provides an automated solution for maintaining regular backups of your Supabase database. It helps keep your data safe and enables easy restoration in case of any accidental data loss or corruption.
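The SUPABASE_DB_URL value can be added as an encrypted secret in the repository settings, or from the GitHub CLI. A minimal sketch (the connection string is a placeholder):

# store the database connection string as a repository secret
gh secret set SUPABASE_DB_URL --body "postgresql://postgres:<pass>@db.<ref>.supabase.co:5432/postgres"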

Never back up your data to a public repository.
name: Supa-backup

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *' # Runs every day at midnight
jobs:   
  run_db_backup:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    env:
      supabase_db_url: ${{ secrets.SUPABASE_DB_URL }}   # For example: postgresql://postgres:[YOUR-PASSWORD]@db.<ref>.supabase.co:5432/postgres
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ github.head_ref }}
      - uses: supabase/setup-cli@v1
        with:
          version: latest
      - name: Backup roles
        run: supabase db dump --db-url "$supabase_db_url" -f roles.sql --role-only
      - name: Backup schema
        run: supabase db dump --db-url "$supabase_db_url" -f schema.sql
      - name: Backup data
        run: supabase db dump --db-url "$supabase_db_url" -f data.sql --data-only --use-copy

      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: Supabase backup
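
As noted above, these dumps make restoration straightforward. A minimal sketch of replaying them into an empty database with psql (the target connection string is a placeholder, and restoring into a managed Supabase project may require skipping roles the platform already reserves):

# apply the dumps in order: roles, then schema, then data
psql "postgresql://postgres:<pass>@db.<new-ref>.supabase.co:5432/postgres" -f roles.sql
psql "postgresql://postgres:<pass>@db.<new-ref>.supabase.co:5432/postgres" -f schema.sql
psql "postgresql://postgres:<pass>@db.<new-ref>.supabase.co:5432/postgres" -f data.sql
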
More resources

    Backing up and migrating your project: Migrating and Upgrading (https://supabase.com/docs/guides/platform/migrating-and-upgrading-projects)

from https://supabase.com/docs/guides/cli/github-action/backups
-------------------------------------------------------------------------------

How to Backup Supabase Database

In this post and video, we'll dive into the specifics of creating a local PostgreSQL backup for your Supabase database.
I'll share with you the command you need to use to both create and restore a Supabase backup.

What is Supabase?

Before getting started, let's introduce Supabase!

Supabase is an open-source Firebase alternative, providing developers with a toolkit to build dynamic web and mobile applications.
It offers a PostgreSQL database, authentication services, real-time subscriptions, and auto-generated APIs. One of its key components is the PostgreSQL database, renowned for its robustness and reliability.

In this article we'll focus on how to back up the Supabase PostgreSQL database, which by the way is a provider we like a lot at SimpleBackups.

When do you need a Supabase backup?

While Supabase offers its own backup solutions, relying solely on these can be risky.

Here’s why an external backup is crucial:

Redundancy: Having an external backup ensures that if something goes wrong with Supabase's internal backups, your data is still safe.
Control: External backups give you more control over backup frequency, retention policies, and recovery strategies.
Compliance: For certain applications, regulations may require external backups to meet data protection standards.

We'll develop a straightforward script to back up your PostgreSQL database.
If you want to automate the process, you can use SimpleBackups for Supabase to schedule your backups and store them on Amazon S3, Google Cloud Storage, or any other cloud storage provider.
If you feel like coding it yourself, you can check out our guide on how to automate PostgreSQL backups.

With that being said, let's get started!

How to back up a Supabase database

In order to back up your Supabase database, you'll need to use the pg_dump command.

Make sure you have PostgreSQL (which provides pg_dump) installed on your machine.

If not, you can install it by following the instructions on the official website.

You can also check our article on how to install PostgreSQL on Docker.

For Ubuntu, simply install it via the command line:

sudo apt-get install postgresql-15

Back up your Supabase database with pg_dump

While we won't go deep into how pg_dump works, here's a quick overview of the command.

For those who want to understand more about pg_dump, we wrote a complete guide on how to back up a PostgreSQL database that goes into more detail.

The format of a Supabase PostgreSQL connection string is similar to a standard PostgreSQL connection string.
It typically includes the following components:

postgresql://myuser:mypassword@db.myproject.supabase.co:5432/mydatabase

Remember to replace myuser, mypassword, db.myproject.supabase.co, and mydatabase with your actual database credentials and details. Also, ensure that your connection string is kept secure and not exposed in your application code or any public repositories.

Now, let's execute the pg_dump command to backup your Supabase database:

Using credentials:

pg_dump --inserts --column-inserts --username=myuser --host=db.myproject.supabase.co --port=5432 mydatabase > database-dump.sql

Or using the connection string:

pg_dump 'postgresql://myuser:mypassword@db.myproject.supabase.co:5432/mydatabase' > database-dump.sql

This command will connect to your Supabase database and create a dump file called database-dump.sql in your current directory.
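To turn this into the straightforward script mentioned earlier, you can wrap pg_dump so each run produces a timestamped, compressed file. A minimal sketch (the connection string is a placeholder):

#!/bin/bash
# backup-supabase.sh: dump the Supabase database to a dated, compressed file
set -euo pipefail
DB_URL='postgresql://myuser:mypassword@db.myproject.supabase.co:5432/mydatabase'
STAMP=$(date +'%Y-%m-%d_%H-%M-%S')
pg_dump "$DB_URL" | gzip > "supabase-backup-${STAMP}.sql.gz"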

Restore your Supabase database with pg_restore

Now that you have a backup of your Supabase database, you can restore it using the methods shown below.

Using psql:

psql -U username -d dbname < database-dump.sql

Or, using pg_restore (which reads pg_dump's archive formats rather than plain SQL files):

pg_restore -U username -d dbname -1 database-dump.dump
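For that to work, the dump itself has to be produced in pg_dump's custom archive format. A minimal sketch of the full round trip (connection details are placeholders):

# dump in custom (archive) format so pg_restore can read it
pg_dump --format=custom --file=database-dump.dump 'postgresql://myuser:mypassword@db.myproject.supabase.co:5432/mydatabase'
# restore the archive into a target database in a single transaction
pg_restore --single-transaction --dbname='postgresql://user:password@host:5432/targetdb' database-dump.dump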

If you want to restore your Supabase backup onto Supabase itself, just use the proper connection string and credentials.

psql 'postgresql://myuser:mypassword@db.myproject.supabase.co:5432/mydatabase' < database-dump.sql

And that's it! You now know how to back up and restore your Supabase database.

from  https://simplebackups.com/blog/how-to-backup-supabase/

(https://www.youtube.com/watch?v=_iVKjnlQf00)

-----------------------------------------------------------

The same backup workflow is also discussed in this Reddit thread: https://www.reddit.com/r/Supabase/comments/rlq5mk/how_can_i_backup_my_supabase_database_and_buckets/?rdt=57177

