Solver Architecture
OptalCP's architecture differs from that of traditional in-process solver libraries. Understanding this design helps explain the asynchronous API, the debugging workflow, and the solver's performance characteristics.
Process Model
The OptalCP solver runs as a separate executable process, not as an in-process library:
Summary
- Separate Process: The solver runs as `optalcp` (or `optalcp.exe` on Windows)
- JSON Communication: Model, parameters, and solutions are serialized to JSON
- Standard I/O: Communication uses stdin (input) and stdout (output)
- Language Bindings: The Python and TypeScript APIs are thin wrappers managing the subprocess
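None of this machinery is visible in application code: a solve looks like an ordinary awaited library call. The sketch below follows the `cp.Solver()` / `solver.solve(model, params)` usage shown later on this page; `build_model_and_params()` is a placeholder for model construction done elsewhere.

```python
import asyncio
import optalcp as cp

async def main():
    model, params = build_model_and_params()    # placeholder: defined elsewhere
    solver = cp.Solver()                        # thin wrapper object; no solver process yet
    result = await solver.solve(model, params)  # subprocess is spawned, fed JSON, and reaped here
    print(result)

asyncio.run(main())
```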
Communication Protocol
When you call `model.solve()`, the following happens:
1. Model Serialization: The model and parameters are serialized to JSON
2. Process Launch: The `optalcp` executable is spawned as a subprocess
3. Model Transmission: The JSON is sent to the solver via stdin
4. Solving: The solver searches for solutions
5. Result Streaming: Solutions and logs are sent back via stdout as JSON
6. Process Exit: The solver terminates when done
7. Result Deserialization: The API parses the JSON and returns a `SolveResult`
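To make the sequence concrete, here is a deliberately simplified picture of what a language binding does. This is not OptalCP's actual implementation, and the JSON payload and message framing are placeholders, but the comments map to the steps above.

```python
# Illustration only: the real wire format and framing used by OptalCP are not shown here.
import json
import subprocess

def solve_via_subprocess(model_json: dict, solver_path: str = "optalcp") -> list[dict]:
    # Step 2: spawn the solver executable as a subprocess
    proc = subprocess.Popen(
        [solver_path],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    # Steps 1 and 3: serialize the model and send it on stdin
    proc.stdin.write(json.dumps(model_json))
    proc.stdin.close()

    # Steps 4 and 5: the solver searches and streams solutions/logs back on stdout
    messages = [json.loads(line) for line in proc.stdout if line.strip()]

    # Steps 6 and 7: the process exits; the binding turns the messages into a result object
    proc.wait()
    return messages
```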
Benefits
- Easy Installation: Binary packages include pre-compiled solver—no compilation required
- Process Isolation: Solver crashes don't crash your application
- Parallel Solves: Multiple solves can run in separate processes
- Standalone Debugging: Run the solver independently with saved JSON models
- Consistent Behavior: Same C++ solver for Python and TypeScript
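For instance, process isolation means a solver failure shows up in the calling program as an error rather than bringing it down. A minimal sketch, assuming a failed run is reported as a Python exception (the exact exception type raised by the API is not specified on this page):

```python
import optalcp as cp

async def robust_solve(model, params):
    solver = cp.Solver()
    try:
        return await solver.solve(model, params)
    except Exception as exc:   # assumption: failures surface as exceptions
        # The crash happened in the solver subprocess, not in this process.
        print(f"Solve failed: {exc}")
        return None
```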
Subprocess Lifecycle
The `Solver` class manages the subprocess:
- Creation: `solver = cp.Solver()` (no process yet)
- Spawn: The process starts when `solve()` is called
- Termination: The process exits when solving completes or `stop()` is called

Each `solve()` call spawns a new process. Processes are not reused between solves.
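A sketch of driving this lifecycle by hand: start a solve as a task and stop it once a wall-clock budget runs out. Whether `stop()` is synchronous and what `solve()` returns after a stop are assumptions here; a time-limit parameter may be the simpler option in practice.

```python
import asyncio
import optalcp as cp

async def solve_with_budget(model, params, budget_s=30.0):
    solver = cp.Solver()                                      # no subprocess yet
    task = asyncio.create_task(solver.solve(model, params))   # subprocess spawns here
    done, _ = await asyncio.wait({task}, timeout=budget_s)
    if not done:
        solver.stop()        # assumption: asks the solver process to terminate early
    return await task        # result found so far (assumed behavior after stop())
```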
Finding the Solver Executable
The API searches for the `optalcp` executable in this order:
1. Environment variable: `OPTALCP_SOLVER` (absolute path)
2. Package directory: Bundled with the Python/npm package
3. System PATH: Looks for `optalcp` in `PATH`
For debugging, you can set the environment variable explicitly:
```bash
export OPTALCP_SOLVER=/path/to/optalcp
python my_script.py
```
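The same can be done from inside a script; a small sketch, setting the variable before the package is imported so it is in place whenever the API looks up the executable (the path is a placeholder):

```python
import os

os.environ["OPTALCP_SOLVER"] = "/path/to/optalcp"   # placeholder: absolute path to the solver

import optalcp as cp   # imported after setting the variable, to be safe
```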
Parallel Solves
You can run multiple solves in parallel by using separate processes:
Python:

```python
import asyncio
import optalcp as cp

async def solve_instance(model, params):
    solver = cp.Solver()
    return await solver.solve(model, params)

async def main():
    # Solve multiple instances in parallel (model1..model3 and params are built elsewhere)
    return await asyncio.gather(
        solve_instance(model1, params),
        solve_instance(model2, params),
        solve_instance(model3, params),
    )

results = asyncio.run(main())
```
TypeScript:

```typescript
import * as CP from '@scheduleopt/optalcp';

async function solveInstance(model: CP.Model, params: CP.Parameters) {
  const solver = new CP.Solver();
  return await solver.solve(model, params);
}

// Solve multiple instances in parallel
const results = await Promise.all([
  solveInstance(model1, params),
  solveInstance(model2, params),
  solveInstance(model3, params)
]);
```
Each `Solver` instance manages its own subprocess, allowing true parallelism.
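Since every concurrent solve is a separate OS process, it can be worth capping how many run at once on a shared machine. A sketch using `asyncio.Semaphore`; the limit of four is arbitrary:

```python
import asyncio
import optalcp as cp

async def solve_bounded(models, params, max_concurrent=4):
    # At most max_concurrent solver subprocesses run at any time.
    sem = asyncio.Semaphore(max_concurrent)

    async def one(model):
        async with sem:                      # wait for a free slot
            solver = cp.Solver()             # one subprocess per solve
            return await solver.solve(model, params)

    return await asyncio.gather(*(one(m) for m in models))
```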
See Also
- Async Solving - Using the Solver class with callbacks
- Model Export - JSON serialization details