OptalCP 2026.2.0 Release
This release introduces a drop-in replacement for IBM CP Optimizer's Python API, adds the element expression for array lookups, and includes bug fixes and API refinements.
New Features
CP Optimizer Compatibility
The new optalcp-cpo package is a drop-in replacement for IBM CP Optimizer's Python API. Existing DOcplex CP models can run on OptalCP with a single import change:
# Before:
from docplex.cp.model import *
# After:
from optalcp_cpo.model import *
No other code changes are needed. The package translates DOcplex CP models to OptalCP at solve time. In our testing, 14 of the 20 scheduling examples from the DOcplex repository work and run faster on OptalCP, and the scheduling tutorial notebook runs through 6 of its 7 chapters.
This is the initial release of the compatibility layer. Not all CP Optimizer features are supported yet — see the optalcp-cpo README for the current status. We are actively working on expanding compatibility.
Element Expression
The new element function creates an integer expression equal to array[subscript], where subscript is a decision variable or expression. It automatically constrains subscript to the valid index range.
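Semantically, element is ordinary array indexing lifted to decision expressions. A plain-Python sketch of the semantics (an illustration, not the solver API — here the subscript is a concrete integer rather than a decision expression):

```python
def element(array, subscript):
    # Plain-Python analogue: array lookup with a range check on the index.
    # In a model, the same restriction becomes a domain constraint on the
    # subscript expression rather than a runtime error.
    if not 0 <= subscript < len(array):
        raise ValueError("subscript outside the valid index range")
    return array[subscript]

durations = [50, 40, 33, 28, 25]
print(element(durations, 2))  # -> 33
```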
In this example, we model a learning effect: the more tasks a worker completes, the faster they get. Each task's duration depends on its position in the processing sequence:
- Python
- TypeScript
# Learning effect: tasks get faster as the worker gains experience
durations = [50, 40, 33, 28, 25]
tasks = [model.interval_var(length=(25, 50), name=f"Task{i}") for i in range(5)]
seq = model.sequence_var(tasks)
model.no_overlap(seq)
for task in tasks:
    model.enforce(task.length() == model.element(durations, task.position(seq)))
model.minimize(model.max([t.end() for t in tasks]))
// Learning effect: tasks get faster as the worker gains experience
const durations = [50, 40, 33, 28, 25];
const tasks = Array.from({length: 5}, (_, i) =>
    model.intervalVar({ length: [25, 50], name: `Task${i}` }));
const seq = model.sequenceVar({ intervals: tasks });
model.noOverlap(seq);
for (const task of tasks) {
    model.enforce(task.length().eq(model.element(durations, task.position(seq))));
}
model.minimize(model.max(tasks.map(t => t.end())));
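A quick sanity check on this model, in plain Python and independent of the solver: because each task's duration is determined solely by its sequence position, the tasks run back to back and every ordering yields the same makespan — the sum of the position durations:

```python
from itertools import permutations

durations = [50, 40, 33, 28, 25]  # duration by sequence position

makespans = set()
for order in permutations(range(5)):  # order[p] = task placed in position p
    end = 0
    for p, task in enumerate(order):
        end += durations[p]  # length depends on the position p, not the task
    makespans.add(end)

print(makespans)  # -> {176}
```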
API Changes
Solver.sendSolution is Now Synchronous
Solver.sendSolution() / Solver.send_solution() is now a synchronous method. It queues the solution for delivery and returns immediately. Remove await from existing calls:
- Python
- TypeScript
# Before:
await solver.send_solution(solution)
# After:
solver.send_solution(solution)
// Before:
await solver.sendSolution(solution);
// After:
solver.sendSolution(solution);
Solver.onSummary Removed
The onSummary / on_summary callback and the SolveSummary type have been removed. Use the SolveResult returned by solve() instead — it contains the same fields:
- Python
- TypeScript
# Before:
def on_summary(summary):
    print(f"Time: {summary.duration:.2f}s")
    print(f"Solutions: {summary.nb_solutions}")

solver.on_summary = on_summary
result = await solver.solve(model)
# After:
result = await solver.solve(model)
print(f"Time: {result.duration:.2f}s")
print(f"Solutions: {result.nb_solutions}")
// Before:
solver.onSummary = (summary) => {
    console.log(`Time: ${summary.duration.toFixed(2)}s`);
    console.log(`Solutions: ${summary.nbSolutions}`);
};
const result = await solver.solve(model);
// After:
const result = await solver.solve(model);
console.log(`Time: ${result.duration.toFixed(2)}s`);
console.log(`Solutions: ${result.nbSolutions}`);
Bug Fixes
- Position expression reuse: Fixed an issue where using a position() expression in multiple constraints could produce incorrect results.
- Python sub-expression reuse: Fixed a Python API issue where reusing a sub-expression in multiple places caused degraded performance.
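For context on the sub-expression fix: the intended behavior is that reusing one expression object references a single shared node in the model, rather than copying the subtree at each use. A minimal sketch of that sharing idea in plain Python (an illustration of the principle only, not OptalCP's internals):

```python
class Model:
    def __init__(self):
        self.nodes = []  # every distinct expression node, recorded once

    def _node(self, op, *args):
        node = (op, args)
        self.nodes.append(node)
        return node

    def var(self, name):
        return self._node("var", name)

    def plus(self, a, b):
        return self._node("plus", a, b)

m = Model()
x = m.var("x")
shared = m.plus(x, x)           # build the sub-expression once...
total = m.plus(shared, shared)  # ...reusing it adds one node, not a copied subtree
print(len(m.nodes))             # -> 3: the variable, the inner plus, the outer plus
```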