Solver Architecture

OptalCP's architecture differs from traditional in-process solver libraries. Understanding this design helps explain the asynchronous API, debugging workflow, and performance characteristics.

Process Model

The OptalCP solver runs as a separate executable process, not as an in-process library:

[Diagram: Your Application (Python / TypeScript) → OptalCP API (thin wrapper) → JSON over stdin/stdout → optalcp executable (C++ solver, separate process)]

Summary

  1. Separate Process: The solver runs as optalcp (or optalcp.exe on Windows)
  2. JSON Communication: Model, parameters, and solutions are serialized to JSON
  3. Standard I/O: Communication uses stdin (input) and stdout (output)
  4. Language Bindings: The Python and TypeScript APIs are thin wrappers managing the subprocess

Communication Protocol

When you call model.solve(), the following happens:

  1. Model Serialization: Model and parameters are serialized to JSON
  2. Process Launch: The optalcp executable is spawned as a subprocess
  3. Model Transmission: JSON is sent to the solver via stdin
  4. Solving: The solver searches for solutions
  5. Result Streaming: Solutions and logs are sent back via stdout as JSON
  6. Process Exit: Solver terminates when done
  7. Result Deserialization: API parses JSON and returns SolveResult
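
The language bindings hide all of these steps, but conceptually the exchange follows an ordinary subprocess-with-JSON-pipes pattern. The Python sketch below only illustrates that pattern: the bare optalcp invocation, the model.json file, and the single communicate() call are simplifying assumptions, and the real message framing and JSON schema are internal to OptalCP.

import subprocess

# Hypothetical: a model and parameters previously serialized to JSON.
with open("model.json") as f:
    model_json = f.read()

# Spawn the solver and exchange JSON over stdin/stdout (steps 2-6 above).
# The exact command line and protocol framing are assumptions for illustration.
proc = subprocess.Popen(
    ["optalcp"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
stdout, _ = proc.communicate(model_json)  # send the model, wait for the process to exit

# Step 7: the API would parse these JSON messages into a SolveResult;
# here we just print the raw solver output.
print(stdout)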

Benefits

  1. Easy Installation: Binary packages include a pre-compiled solver, so no compilation is required
  2. Process Isolation: Solver crashes don't crash your application
  3. Parallel Solves: Multiple solves can run in separate processes
  4. Standalone Debugging: Run the solver independently with saved JSON models
  5. Consistent Behavior: Same C++ solver for Python and TypeScript

Subprocess Lifecycle

The Solver class manages the subprocess:

  1. Creation: solver = cp.Solver() (no process yet)
  2. Spawn: Process starts when solve() is called
  3. Termination: Process exits when solving completes or stop() is called

Each solve() call spawns a new process. Processes are not reused between solves.
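
A minimal sketch of that lifecycle, using the same Solver API shown elsewhere on this page (model and params are placeholders for an already-built model and parameter set):

import asyncio
import optalcp as cp

async def main():
    solver = cp.Solver()                        # creation: no subprocess yet
    result = await solver.solve(model, params)  # spawn: the optalcp process starts here
                                                # and exits when solving completes
    # solver.stop() would ask a running process to terminate early;
    # a later solve() spawns a fresh process, since processes are never reused.
    return result

asyncio.run(main())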

Finding the Solver Executable

The API searches for the optalcp executable in this order:

  1. Environment variable: OPTALCP_SOLVER (absolute path)
  2. Package directory: Bundled with the Python/npm package
  3. System PATH: Looks for optalcp in PATH

For debugging, you can set the environment variable explicitly:

export OPTALCP_SOLVER=/path/to/optalcp
python my_script.py
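
For illustration only (this is not the bindings' actual code), the documented search order corresponds to a lookup like the one below; the package_dir argument and the bin/ layout are assumptions:

import os
import shutil

def find_optalcp(package_dir):
    # 1. Explicit override via the OPTALCP_SOLVER environment variable
    explicit = os.environ.get("OPTALCP_SOLVER")
    if explicit:
        return explicit
    # 2. Executable bundled with the installed Python/npm package
    #    (the bin/ subdirectory here is a guess for illustration)
    bundled = os.path.join(package_dir, "bin", "optalcp")
    if os.path.isfile(bundled):
        return bundled
    # 3. Fall back to whatever optalcp is found on the system PATH
    return shutil.which("optalcp")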

Parallel Solves

You can run multiple solves in parallel, since each solve runs in its own process:

import asyncio
import optalcp as cp

async def solve_instance(model, params):
    solver = cp.Solver()
    return await solver.solve(model, params)

async def main():
    # Solve multiple instances in parallel (model1, model2, model3 and
    # params are assumed to be defined elsewhere)
    return await asyncio.gather(
        solve_instance(model1, params),
        solve_instance(model2, params),
        solve_instance(model3, params),
    )

results = asyncio.run(main())

Each Solver instance manages its own subprocess, allowing true parallelism.
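
Because every concurrent solve is a full solver process, you may want to cap how many run at once when solving many instances. One common, OptalCP-agnostic pattern is an asyncio.Semaphore; the limit of 4 below is an arbitrary example:

import asyncio
import optalcp as cp

async def solve_all(models, params, max_concurrent=4):
    # Cap the number of solver processes running at the same time.
    semaphore = asyncio.Semaphore(max_concurrent)

    async def solve_limited(model):
        async with semaphore:
            solver = cp.Solver()
            return await solver.solve(model, params)

    return await asyncio.gather(*(solve_limited(m) for m in models))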

See Also