# Python Driver
A lightweight, in-process Python API for querying, mutating, and integrating with the Velr graph database.
The Python driver exposes:
- Connection management (`Velr.open`, context manager)
- Executing Cypher queries (`run`, `execute`, `exec`, `exec_one`)
- Streaming result tables (`Stream`, `StreamTx`, `Table`, `Rows`, `Cell`)
- Typed conversions to PyArrow, pandas, and Polars
- Transactional control (`begin_tx`, `VelrTx`)
- Savepoints (`Savepoint`)
- Binding external data into Cypher via `BIND(...)`:
    - PyArrow columns / tables (`bind_arrow`)
    - pandas (`bind_pandas`)
    - Polars (`bind_polars`)
    - NumPy (`bind_numpy`)
    - List-of-dicts (`bind_records`)
Everything runs in-process with no network hop.
## Opening a Database

`Velr.open(path: str | None) -> Velr`

- `None` → in-memory database
- `"path"` → file-backed database at that path
You can also use Velr as a context manager:
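For example (a sketch; the context-manager support and `close` semantics follow the API summary at the end of this page):

```python
from velrpy import Velr

with Velr.open() as db:  # None → in-memory database
    db.run("CREATE (:User {name: 'Alice'})")
# db is closed here
```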
When the context exits, the connection is closed.
## Executing Queries

### `run`: execute and discard results

Use `run` (or its alias `execute`) for queries where you don’t care about the result tables — typical for DDL and writes:
```python
db.run("CREATE (:User {name: 'Alice'})")
db.execute("MATCH (u:User {name:'Alice'}) SET u.active = true")
```
- Drains all result tables and discards them.
- Suitable for `CREATE`, `MERGE`, `SET`, `DELETE`, or scripts.
### `exec`: stream result tables

Use `exec` when your query returns rows and may produce one or more tables:
```python
with db.exec("MATCH (u:User) RETURN u.name AS name") as stream:
    while True:
        table = stream.next_table()
        if table is None:
            break
        with table:
            print(table.column_names())  # ['name']
            with table.rows() as rows:
                for row in rows:
                    cell = row[0]  # Cell
                    name = cell.as_python()
                    print("User:", name)
```
Key methods on `Stream`:

- `next_table() -> Table | None` — fetch the next table (or `None` if done)
- `iter_tables()` — generator that yields each `Table`
- Context manager (`with Stream(...) as st:`) — ensures the stream is closed
### `exec_one`: expect exactly one table

Use `exec_one` when you expect exactly one result table:
```python
table = db.exec_one(
    "MATCH (u:User) RETURN u.name AS name, u.age AS age"
)
with table:
    print(table.column_names())  # ['name', 'age']
```
Behavior:
- Raises `VelrError` if the query returns no tables.
- Raises `VelrError` if it returns more than one table.
- Caller is responsible for closing the returned `Table` (or using it as a context manager).
## Working with Tables, Rows, and Cells

### Table

A `Table` represents one result table from a query.
Main methods:
- `column_names() -> list[str]`
- `rows() -> Rows`
- Arrow / pandas / Polars helpers (see below)
- Context manager (`with table:`) automatically closes the table
Example:
```python
table = db.exec_one("MATCH (m:Movie) RETURN m.title AS title, m.released AS year")
with table:
    print(table.column_names())  # ['title', 'year']
    with table.rows() as rows:
        for row in rows:
            title_cell, year_cell = row  # tuple[Cell, Cell]
            title = title_cell.as_python()
            year = year_cell.as_python()
            print(f"{title} ({year})")
```
### Rows

`Rows` is an iterator over result rows for a given table.

- Each iteration yields a `tuple[Cell, ...]`
- Context manager ensures underlying resources are properly closed
### Cell and Cell.as_python

Each `Cell` is a lightweight value wrapper:

```python
from velrpy import Cell  # imported implicitly via results

# Cell fields (mostly for advanced use)
#   ty: int      # internal type tag
#   i64: int     # integer representation
#   f64: float   # floating representation
#   data: bytes  # raw bytes for text/json/binary
```
Most of the time you’ll just use `as_python`.

`Cell.as_python(...)` returns:

- `None` for null
- `bool` for booleans
- `int` for integer values
- `float` for floating-point values
- `str` or `bytes` for text / JSON
- For JSON:
    - If `parse_json=True`, returns decoded Python objects (dict/list/etc.)
    - If `parse_json=False`, returns the raw string (or bytes if `decode_text=False`)
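As a rough stdlib-only model of this behavior (an assumption for illustration — the real `Cell` decodes its internal `data` bytes inside the driver; `as_python_json` below is a hypothetical helper, not part of the API):

```python
import json

def as_python_json(data: bytes, *, parse_json: bool = False,
                   decode_text: bool = True, encoding: str = "utf-8") -> object:
    """Model how a JSON cell's raw bytes map to Python values."""
    if parse_json:
        return json.loads(data)       # decoded dict/list/str/number
    if decode_text:
        return data.decode(encoding)  # raw JSON text as str
    return data                       # raw bytes

payload = b'{"kind": "click", "xy": [3, 4]}'
print(as_python_json(payload, parse_json=True)["xy"])  # [3, 4]
```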
Example: reading JSON:
```python
table = db.exec_one("MATCH (e:Event) RETURN e.payload AS payload")
with table, table.rows() as rows:
    for row in rows:
        payload_cell = row[0]
        payload = payload_cell.as_python(parse_json=True)
        # payload is now a dict/list/etc., if stored as JSON
        print(payload)
```
## Converting Results to PyArrow, pandas, and Polars

There are two layers:

- Convert the result of a query directly from the connection (`Velr`)
- Convert a single result table (`Table`)
### From a query (Velr)
These helpers execute a Cypher query, take the last result table, and convert it:
```python
# PyArrow Table
pa_table = db.to_pyarrow("MATCH (m:Movie) RETURN m.title, m.released")

# pandas DataFrame
df = db.to_pandas("MATCH (u:User) RETURN u.name, u.age")

# Polars DataFrame
pl_df = db.to_polars("MATCH (s:Sensor) RETURN s.id, s.value")
```
Notes:
- All result tables are drained; only the last table is converted.
- If the query returns no tables, `to_pyarrow` returns an empty `pyarrow.Table`.
### From a single table (Table)

If you’re already working with a `Table`, you can convert it directly:
```python
table = db.exec_one("MATCH (m:Movie) RETURN m.title, m.released")
with table:
    pa_table = table.to_pyarrow()
    df = table.to_pandas()
    pl_df = table.to_polars()
```
Additional low-level helpers on `Table`:

- `to_pyarrow_zero_copy()` → `pyarrow.Buffer` over an IPC stream
- `to_ipc_memoryview()` → `memoryview` of IPC bytes
- `to_ipc_bytes()` → raw `bytes` of IPC file
- `write_ipc_to(fileobj)` → write IPC bytes to a file-like object
- `save_ipc(path: str)` → save IPC bytes directly to disk
## Transactions

Velr supports explicit transactions via `Velr.begin_tx()`.
### Starting a transaction
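An explicit sketch (mirroring the `begin_tx` / `commit` usage shown in the Commit / Rollback section):

```python
tx = db.begin_tx()
tx.run("CREATE (:Person {name: 'Neo'})")
tx.commit()
```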
Or using a context manager:
```python
with db.begin_tx() as tx:
    tx.run("CREATE (:Person {name: 'Neo'})")
    # commit / rollback manually, or rely on auto-rollback on exit
```
### Running queries inside a transaction

Inside a transaction, you use the same methods — `run`, `execute`, `exec`, and `exec_one` — just on `VelrTx`.
```python
with db.begin_tx() as tx:
    tx.run("CREATE (:Person {name:'Neo'})")

    table = tx.exec_one(
        "MATCH (p:Person) RETURN count(p) AS c"
    )
    with table, table.rows() as rows:
        for (count_cell,) in rows:
            print("Count:", count_cell.as_python())
```
`VelrTx` provides:

- `run(cypher: str) -> None`
- `execute(cypher: str) -> None` (alias for `run`)
- `exec(cypher: str) -> StreamTx`
- `exec_one(cypher: str) -> Table`
### Commit / Rollback
```python
tx = db.begin_tx()
try:
    tx.run("CREATE (:Person {name:'Trinity'})")
    tx.commit()
except Exception:
    tx.rollback()
    raise
```
With a context manager, the behavior is:
- If you don’t call `commit()`, the transaction is rolled back on exit.
- On any exception inside the `with` block, the transaction is rolled back.
```python
with db.begin_tx() as tx:
    tx.run("CREATE (:Person {name:'Smith'})")
    # If this block exits without tx.commit(), tx is rolled back automatically.
```
## Savepoints

Savepoints let you partially roll back within a transaction.

### Creating and using a savepoint
```python
with db.begin_tx() as tx:
    # Initial work
    tx.run("CREATE (:Temp {k: 'before'})")

    sp = tx.savepoint()
    tx.run("CREATE (:Temp {k: 'after'})")

    # Decide to undo the last part only
    sp.rollback()

    tx.commit()
```
Using a `Savepoint` as a context manager:

```python
with db.begin_tx() as tx:
    tx.run("CREATE (:Temp {stage: 1})")

    # If anything fails inside this block, we rollback to the savepoint.
    with tx.savepoint() as sp:
        tx.run("CREATE (:Temp {stage: 2})")
    # If this block exits cleanly, the savepoint is released automatically.
```
Savepoint methods:
- `release()` — mark the savepoint as successful; cannot rollback afterwards.
- `rollback()` — rollback to this savepoint and release it.
- `close()` — best-effort close (used internally; context managers call `release` or `rollback` depending on outcome).
## Binding External Data with BIND(...)

You can expose external data to Cypher as logical tables, then read them from Cypher with `BIND('name')`.
On the Python side, you bind data with methods on either:
- `Velr` (connection-scoped)
- `VelrTx` (transaction-scoped)
Supported bindings:
- PyArrow arrays/tables (`bind_arrow`)
- pandas DataFrames (`bind_pandas`)
- Polars DataFrames (`bind_polars`)
- NumPy data (`bind_numpy`)
- List-of-dicts (`bind_records`)
In Cypher, you typically read a bound table with `UNWIND BIND('name') AS r`, as in the examples below.
### 1. Binding PyArrow columns or tables

```python
import pyarrow as pa
from velrpy import Velr

db = Velr.open()

# From a pyarrow.Table
tbl = pa.table({
    "name": ["Alice", "Bob"],
    "age": [30, 25],
})

db.bind_arrow("people", tbl)

db.run("""
    UNWIND BIND('people') AS r
    CREATE (:Person {name: r.name, age: r.age})
""")
```
`bind_arrow(logical: str, columns)` accepts:

- `dict[str, pa.Array | pa.ChunkedArray]`
- `list[(str, pa.Array | pa.ChunkedArray)]`
- `pyarrow.RecordBatch`
- `pyarrow.Table`
Transaction-scoped version:
```python
with db.begin_tx() as tx:
    tx.bind_arrow("people_tx", tbl)
    tx.run("UNWIND BIND('people_tx') AS r CREATE (:Person {name:r.name})")
    tx.commit()
```
### 2. Binding pandas DataFrames

```python
import pandas as pd
from velrpy import Velr

df = pd.DataFrame(
    {"name": ["Alice", "Bob"], "score": [10, 20]}
)

db.bind_pandas("players", df)

db.run("""
    UNWIND BIND('players') AS r
    CREATE (:Player {name: r.name, score: r.score})
""")
```
Signature:
```python
bind_pandas(
    logical: str,
    df: pd.DataFrame,
    *,
    index: bool = False,
    jsonify_objects: bool = True,
    schema: pa.Schema | None = None,
)
```
Defaults:
- `index=False` — the pandas index is not included as a column.
- `jsonify_objects=True` — `object` columns with dicts/lists/etc. are encoded as JSON strings (or `None`), to keep Arrow schemas simple.
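A rough stdlib-only model of what `jsonify_objects=True` does to one `object` column (an assumption for illustration — the real encoding happens inside the driver, and `jsonify_object_column` is a hypothetical helper):

```python
import json

def jsonify_object_column(values: list) -> list:
    """Encode nested values as JSON strings; pass None through."""
    out = []
    for v in values:
        if v is None:
            out.append(None)
        elif isinstance(v, (dict, list)):
            out.append(json.dumps(v))
        else:
            out.append(v)  # plain scalars assumed to pass through unchanged
    return out

col = [{"a": 1}, None, [1, 2], "plain"]
print(jsonify_object_column(col))  # ['{"a": 1}', None, '[1, 2]', 'plain']
```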
Transaction-scoped version: VelrTx.bind_pandas(...) (same signature).
### 3. Binding Polars DataFrames

```python
import polars as pl
from velrpy import Velr

df = pl.DataFrame(
    {"name": ["Alice", "Bob"], "active": [True, False]}
)

db.bind_polars("users_pl", df)

db.run("""
    UNWIND BIND('users_pl') AS r
    CREATE (:User {name: r.name, active: r.active})
""")
```
Signature:
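As given in the API summary:

```python
bind_polars(
    logical: str,
    df: pl.DataFrame,
    *,
    rechunk: bool = False,
)
```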
- Set `rechunk=True` if you want Polars to consolidate buffers before binding.
Transaction-scoped version: VelrTx.bind_polars(...) (same signature).
### 4. Binding NumPy data

```python
import numpy as np
from velrpy import Velr

db = Velr.open()

data = {
    "name": np.array(["Alice", "Bob"], dtype="object"),
    "age": np.array([30, 25], dtype="int64"),
}

db.bind_numpy("np_people", data)

db.run("""
    UNWIND BIND('np_people') AS r
    CREATE (:Person {name: r.name, age: r.age})
""")
```
Or with a 2D array:
```python
arr = np.array([[1.0, 2.0],
                [3.0, 4.0]])
names = ["x", "y"]

db.bind_numpy("points", arr, names=names)
```
Signature:
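As given in the API summary:

```python
bind_numpy(
    logical: str,
    data,
    *,
    names: list[str] | None = None,
    types: dict | None = None,
)
```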
- `data` can be:
    - `dict[str, np.ndarray]`
    - 2D `np.ndarray` (must provide `names=[...]`)
- Optional `types` lets you override Arrow types per column.
Transaction-scoped version: VelrTx.bind_numpy(...).
### 5. Binding list-of-dicts
For small or ad-hoc datasets:
```python
rows = [
    {"name": "Alice", "age": 30},
    {"name": "Bob", "age": 25},
]

db.bind_records("records_people", rows)

db.run("""
    UNWIND BIND('records_people') AS r
    CREATE (:Person {name: r.name, age: r.age})
""")
```
Signature:
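As given in the API summary:

```python
bind_records(
    logical: str,
    rows: list[dict],
    *,
    types: dict | None = None,
)
```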
- `types` can optionally specify Arrow dtypes.
Transaction-scoped version: VelrTx.bind_records(...).
## Error Handling

Most operations can raise `VelrError`:
```python
from velrpy import Velr, VelrError

try:
    db = Velr.open("my.db")
    db.run("MALFORMED CYPHER")
except VelrError as e:
    print("Velr error:", e)
```
Common cases:
- Cypher syntax errors
- Type mismatches
- Transaction errors
- Invalid binding inputs
`VelrError` inherits from `RuntimeError`.
## Complete Example

```python
from velrpy import Velr, VelrError

def main():
    db = Velr.open()

    # Create graph
    db.run("CREATE (:Movie {title:'Inception', released:2010})")

    # Query and print
    table = db.exec_one(
        "MATCH (m:Movie) RETURN m.title AS title, m.released AS year"
    )
    with table, table.rows() as rows:
        for row in rows:
            title = row[0].as_python()
            year = row[1].as_python()
            print(f"{title} ({year})")

if __name__ == "__main__":
    try:
        main()
    except VelrError as e:
        print("Velr error:", e)
```
## API Summary

### Class: Velr

- `Velr.open(path: str | None = None) -> Velr`
- `close() -> None`
- Context manager support: `__enter__`, `__exit__`
Query execution:
- `run(cypher: str) -> None`
- `execute(cypher: str) -> None` (alias for `run`)
- `exec(cypher: str) -> Stream`
- `exec_one(cypher: str) -> Table`
Result conversion:
- `to_pyarrow(cypher: str) -> pa.Table`
- `to_pandas(cypher: str, **kwargs) -> pd.DataFrame`
- `to_polars(cypher: str) -> pl.DataFrame`
Bindings:
- `bind_arrow(logical: str, columns) -> None`
- `bind_pandas(logical: str, df: pd.DataFrame, *, index=False, jsonify_objects=True, schema=None) -> None`
- `bind_polars(logical: str, df: pl.DataFrame, *, rechunk=False) -> None`
- `bind_numpy(logical: str, data, *, names: list[str] | None = None, types: dict | None = None) -> None`
- `bind_records(logical: str, rows: list[dict], *, types: dict | None = None) -> None`
Transactions:
- `begin_tx() -> VelrTx`
### Class: Stream (outside transaction)

- `next_table() -> Table | None`
- `iter_tables() -> Iterator[Table]`
- `close() -> None`
- Context manager support
### Class: StreamTx (inside transaction)

- `next_table() -> Table | None`
- `iter_tables() -> Iterator[Table]`
- `close() -> None`
- Context manager support
### Class: Table

- `column_names() -> list[str]`
- `rows() -> Rows`
- Arrow / IPC helpers:
    - `to_pyarrow_zero_copy() -> pa.Buffer`
    - `to_pyarrow() -> pa.Table`
    - `to_pandas(**kwargs) -> pd.DataFrame`
    - `to_polars() -> pl.DataFrame`
    - `to_ipc_memoryview() -> memoryview`
    - `to_ipc_bytes() -> bytes`
    - `write_ipc_to(fileobj) -> int`
    - `save_ipc(path: str) -> None`
- `close() -> None`
- Context manager support
### Class: Rows

- Python iterator: yields `tuple[Cell, ...]`
- `close() -> None`
- Context manager support
### Class: Cell

- Fields (primarily for advanced use):
    - `ty: int`
    - `i64: int`
    - `f64: float`
    - `data: bytes`
- Methods:
    - `as_python(*, decode_text=True, parse_json=False, encoding="utf-8", errors="strict")`
### Class: VelrTx

- Query execution:
    - `run(cypher: str) -> None`
    - `execute(cypher: str) -> None` (alias)
    - `exec(cypher: str) -> StreamTx`
    - `exec_one(cypher: str) -> Table`
- Transaction control:
    - `commit() -> None`
    - `rollback() -> None`
    - `close() -> None`
- Savepoints:
    - `savepoint() -> Savepoint`
- Bindings (transaction-scoped):
    - `bind_arrow(logical: str, columns) -> None`
    - `bind_pandas(...) -> None`
    - `bind_polars(...) -> None`
    - `bind_numpy(...) -> None`
    - `bind_records(...) -> None`
- Context manager support (auto-rollback on exit if not committed)
### Class: Savepoint

- `release() -> None`
- `rollback() -> None`
- `close() -> None`
- Context manager support:
    - clean exit → `release()`
    - exception → `rollback()`