Run, mock and test fake Snowflake databases locally.
pip install fakesnow
Or to install with the server:
pip install fakesnow[server]
fakesnow offers two main approaches for faking Snowflake: in-process patching of the Snowflake Connector for Python or a standalone HTTP server.
Patching only applies to the current Python process. If a subprocess is spawned it won't be patched. For subprocesses, or for non-Python clients, use the server instead.
To run script.py with patching:
fakesnow script.py
Or to run a module, e.g. pytest:
fakesnow -m pytest
fakesnow executes fakesnow.patch before running the script or module.
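For example, a hypothetical script.py like the sketch below (database, schema, and table names are made up) needs no fakesnow-specific code when launched via the CLI:

# script.py (hypothetical) - plain connector code, no fakesnow imports
# run with: fakesnow script.py
import snowflake.connector

with snowflake.connector.connect(database="db1", schema="schema1") as conn:
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS greetings (msg TEXT)")
    cur.execute("INSERT INTO greetings VALUES ('Hello fake world!')")
    print(cur.execute("SELECT msg FROM greetings").fetchall())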
Alternatively, use fakesnow.patch in your code:
import fakesnow
import snowflake.connector
with fakesnow.patch():
    conn = snowflake.connector.connect()
    print(conn.cursor().execute("SELECT 'Hello fake world!'").fetchone())
The following standard imports are automatically patched:
- import snowflake.connector.connect
- import snowflake.connector.pandas_tools.write_pandas
To patch modules that use the from ... import syntax, you need to manually specify them, e.g. if mymodule.py contains:
from snowflake.connector.pandas_tools import write_pandas
Then patch it using:
with fakesnow.patch("mymodule.write_pandas"):
    ...
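Putting it together, a rough sketch (mymodule.py and its load_frame function are hypothetical; database, schema, and table names are made up):

# mymodule.py (hypothetical)
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def load_frame(df: pd.DataFrame) -> None:
    with snowflake.connector.connect(database="db1", schema="schema1") as conn:
        conn.cursor().execute("CREATE TABLE IF NOT EXISTS frames (ID INT, NAME TEXT)")
        write_pandas(conn, df, "FRAMES")

# test_mymodule.py (hypothetical)
import pandas as pd
import fakesnow
import mymodule

def test_load_frame() -> None:
    # patches the standard imports plus mymodule's from-imported write_pandas
    with fakesnow.patch("mymodule.write_pandas"):
        mymodule.load_frame(pd.DataFrame({"ID": [1], "NAME": ["fake"]}))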
By default, databases are in-memory and will be lost when the process ends. To persist databases between processes, specify a databases path:
with fakesnow.patch(db_path="databases/"):
    ...
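As a rough sketch of persistence (the path, database, and table names are made up), data written under one patch context is still there in a later one, even from a new process:

import fakesnow
import snowflake.connector

# first run: write some data, persisted under databases/
with fakesnow.patch(db_path="databases/"):
    with snowflake.connector.connect(database="db1", schema="schema1") as conn:
        conn.cursor().execute("CREATE TABLE IF NOT EXISTS messages (msg TEXT)")
        conn.cursor().execute("INSERT INTO messages VALUES ('hello again')")

# later run (possibly a new process): the table and its rows are still there
with fakesnow.patch(db_path="databases/"):
    with snowflake.connector.connect(database="db1", schema="schema1") as conn:
        print(conn.cursor().execute("SELECT msg FROM messages").fetchall())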
For scenarios where patching won't work (like subprocesses or non-Python clients), you can run fakesnow as an HTTP server:
import fakesnow
import snowflake.connector
# Start the fakesnow server in a context manager
# This yields connection kwargs (host, port, etc.)
with fakesnow.server() as conn_kwargs:
    # Connect to the fakesnow server using the yielded kwargs
    with snowflake.connector.connect(**conn_kwargs) as conn:
        print(conn.cursor().execute("SELECT 'Hello fake server!'").fetchone())

# The server is automatically stopped when exiting the context manager
This starts an HTTP server in its own thread listening for requests on localhost on an available random port. The server accepts any username/password combination.
By default, the server uses a single in-memory database for its lifetime. To configure database persistence or isolation:
# Databases will be saved to the "databases/" directory
with fakesnow.server(session_parameters={"FAKESNOW_DB_PATH": "databases/"}):
    ...

# Each connection gets its own isolated in-memory database
with fakesnow.server(session_parameters={"FAKESNOW_DB_PATH": ":isolated:"}):
    ...
To specify a port for the server:
with fakesnow.server(port=12345) as conn_kwargs:
    ...
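For instance, a minimal sketch checking that the yielded kwargs use the fixed port (the port number is arbitrary):

import fakesnow
import snowflake.connector

with fakesnow.server(port=12345) as conn_kwargs:
    # the yielded kwargs point at the fixed port, so clients in other
    # processes can also be told to connect to localhost:12345
    assert conn_kwargs["port"] == 12345
    with snowflake.connector.connect(**conn_kwargs) as conn:
        print(conn.cursor().execute("SELECT 1").fetchone())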
fakesnow provides fixtures for easier test integration. Add them in conftest.py:
pytest_plugins = "fakesnow.fixtures"
To autouse the fixture you can wrap it like this in conftest.py:
from typing import Iterator
import pytest
pytest_plugins = "fakesnow.fixtures"
@pytest.fixture(scope="session", autouse=True)
def setup(_fakesnow_session: None) -> Iterator[None]:
    # the standard imports are now patched
    # Add any additional setup here
    yield
    # Add any teardown here
For code that uses from ... import statements:
from typing import Iterator
import fakesnow
import pytest
pytest_plugins = "fakesnow.fixtures"
@pytest.fixture(scope="session", autouse=True)
def _fakesnow_session() -> Iterator[None]:
    with fakesnow.patch("mymodule.write_pandas"):
        yield
To start a fakesnow server instance, enable the plugin in conftest.py:
pytest_plugins = "fakesnow.fixtures"
And then use the fakesnow_server session fixture like this:
import snowflake.connector
def test_with_server(fakesnow_server: dict):
    # fakesnow_server contains connection kwargs (host, port, etc.)
    with snowflake.connector.connect(**fakesnow_server) as conn:
        cur = conn.cursor()
        cur.execute("SELECT 1")
        assert cur.fetchone() == (1,)
Fully supported:
- Standard SQL operations and cursors
- Information schema queries
- Multiple databases
- Parameter binding in queries (see the sketch after this list)
- Table comments
- Pandas integration including write_pandas(..) (not available via the server yet)
- Result batch retrieval via get_result_batches()
- HTTP server for non-Python connectors
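For example, parameter binding and get_result_batches() behave as they do with the real connector; a minimal sketch (database, schema, and table names are made up):

import fakesnow
import snowflake.connector

with fakesnow.patch():
    with snowflake.connector.connect(database="db1", schema="schema1") as conn:
        cur = conn.cursor()
        cur.execute("CREATE TABLE items (id INT, name TEXT)")
        cur.execute("INSERT INTO items VALUES (1, 'one'), (2, 'two')")

        # parameter binding using the connector's default pyformat style
        cur.execute("SELECT name FROM items WHERE id = %s", (2,))
        print(cur.fetchone())  # ('two',)

        # result batch retrieval
        cur.execute("SELECT * FROM items")
        batches = cur.get_result_batches()
        print(sum(batch.rowcount for batch in batches))  # 2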
Partially supported:
- Date functions
- Regular expression functions
- Semi-structured data operations
- Tags
- User management
- COPY INTO from S3 sources, see COPY INTO below
Not yet implemented:
For more detail see the test suite.
- Row ordering is non-deterministic and may differ from Snowflake unless you fully specify the ORDER BY clause.
- fakesnow supports a more liberal SQL dialect than actual Snowflake. This means some queries that work with fakesnow might not work with a real Snowflake instance.
COPY INTO can be used from S3 sources. By default the standard AWS credential chain will be used. If you are getting an HTTP 403 or need to provide alternative S3 credentials you can use the duckdb CREATE SECRET statement. For an example of creating a secret to use a moto S3 endpoint see s3_client in conftest.py.
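A rough sketch (the bucket, object key, and table names are made up; assumes the default AWS credential chain can read the object):

import fakesnow
import snowflake.connector

with fakesnow.patch():
    with snowflake.connector.connect(database="db1", schema="schema1") as conn:
        cur = conn.cursor()
        cur.execute("CREATE TABLE readings (id INT, temp FLOAT)")
        # hypothetical bucket/key, read using the standard AWS credential chain
        cur.execute(
            "COPY INTO readings FROM 's3://my-bucket/readings.csv' "
            "FILE_FORMAT = (TYPE = 'CSV')"
        )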
See CONTRIBUTING.md for instructions on getting started with development and contributing to this project.