The cutting optimization problem belongs to the class of NP-Complete (Nondeterministic Polynomial-time Complete) problems [1]. Other problems in this class include Hamiltonian Path, Travelling Salesman, Subset Sum, Clique, Independent Set, Graph Colouring, and many more. All these problems have been studied intensively by a large number of researchers, but no polynomial-time algorithm has been discovered for any of them. This has a direct consequence on the running time and the quality of the optimization.
A polynomial-time algorithm is one whose running time is bounded by a polynomial function of its input size. For instance, if we have n = 1000 pieces to cut and the cutting algorithm has complexity O(n^{2}), then the running time is proportional to 1000^{2} = 10^{6} units of time. Assuming that our computers can perform 10^{9} operations per second, the cutting optimization algorithm would run in a fraction of a second. Sadly, this is not the case for the cutting optimization problem: no such fast algorithm is known for solving it.
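The arithmetic above is easy to check directly; a minimal sketch, assuming the 10^{9} operations-per-second figure from the text:

```python
# Estimated running time of an O(n^2) algorithm on n = 1000 pieces,
# assuming a machine performing 10^9 operations per second.
n = 1000
ops_per_second = 10 ** 9

operations = n ** 2                    # 10^6 operations
seconds = operations / ops_per_second

print(operations)  # 1000000
print(seconds)     # 0.001 -- a fraction of a second, as claimed
```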
The only exact algorithm known for solving the cutting optimization problem is an exponential one. An exponential algorithm runs in an exponential amount of time (2^{n}, 3^{n}, n! – where n is the number of pieces to be optimized). Thus, even for moderately sized instances (let's say 1000 pieces) an exponential algorithm behaves as follows:
- If the complexity is 2^{n}, then the total number of operations is 2^{1000}, which can be approximated by 10^{300}. Knowing that our computers can perform 10^{9} operations per second, we need 10^{291} seconds to run the algorithm. One year has about 10^{8} seconds, so our algorithm would run for about 10^{283} years. This is a huge value compared to the age of the universe, which is only about 10^{10} years.
- If the complexity is 3^{n}, then the total number of operations is 3^{1000}, which can be approximated by 10^{477}.
- If the complexity is n!, then the total number of operations is 1000!, which can be approximated by 10^{2567}.
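These orders of magnitude can be reproduced in a few lines of Python by working with logarithms (the numbers themselves are far too large to store); `math.lgamma(n + 1)` gives the natural logarithm of n!:

```python
import math

n = 1000

# log10 of the operation counts for each complexity class
log10_2n = n * math.log10(2)                     # 2^1000  -> ~10^301
log10_3n = n * math.log10(3)                     # 3^1000  -> ~10^477
log10_fact = math.lgamma(n + 1) / math.log(10)   # 1000!   -> ~10^2567

print(int(log10_2n))    # 301 (the classroom shortcut 2^10 ~ 10^3 gives 10^300)
print(int(log10_3n))    # 477
print(int(log10_fact))  # 2567

# Years needed at 10^9 operations/second
seconds_per_year = 365.25 * 24 * 3600
log10_years = log10_2n - 9 - math.log10(seconds_per_year)
print(int(log10_years))  # 284 -- astronomically beyond the ~10^10-year
                         # age of the universe
```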

These algorithms would run for an astronomical number of years. Even if we put all the computers in the world to work on the problem in parallel, we still would not get a significant improvement in speed.
This is why another possibility (which is employed by our software too) is to use heuristics or approximation algorithms. A heuristic is an algorithm which is fast and returns a good solution to the problem, often even the optimal one. However, there is no guarantee that the obtained solution is optimal.
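As an illustration, here is a minimal sketch of a classic heuristic for the one-dimensional cutting problem, first-fit decreasing; the function and the sample data are ours for illustration, not taken from the software described here. It runs in polynomial time but offers no optimality guarantee:

```python
def first_fit_decreasing(pieces, stock_length):
    """Greedy heuristic: sort piece lengths in decreasing order and place
    each piece into the first stock bar with enough free length, opening
    a new bar when none fits. Runs in O(n^2), not exponential time."""
    free = []    # remaining free length of each opened bar
    layout = []  # pieces assigned to each bar
    for piece in sorted(pieces, reverse=True):
        for i, f in enumerate(free):
            if piece <= f:
                free[i] -= piece
                layout[i].append(piece)
                break
        else:
            free.append(stock_length - piece)
            layout.append([piece])
    return layout

# Cutting nine pieces from 100-unit stock bars:
print(first_fit_decreasing([70, 60, 50, 33, 33, 33, 10, 7, 3], 100))
# [[70, 10, 7, 3], [60, 33], [50, 33], [33]] -- 4 bars, found instantly
```

For this instance 4 bars happens to be optimal, but on other inputs a greedy heuristic like this can waste stock that an exhaustive search would save.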
An important parameter of the software is the OptimizationLevel. It basically tells the software how many configurations to explore before the best solution found is output. If you set the OptimizationLevel to a very low value, you will obtain a solution very fast, but its quality may be poor. If you set it to a very high value, you will obtain a good solution, but not as fast. Thus, one must trade off the quality of the solution against the running time.
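The trade-off can be sketched as follows; the loop and names are illustrative, not the software's actual implementation. A higher level means more configurations tried, hence more running time but a solution that can only stay the same or improve:

```python
import random

def bars_used(order, stock_length):
    """First-fit packing of pieces in the given order; returns bar count."""
    free = []  # remaining free length of each opened bar
    for piece in order:
        for i, f in enumerate(free):
            if piece <= f:
                free[i] -= piece
                break
        else:
            free.append(stock_length - piece)
    return len(free)

def optimize(pieces, stock_length, optimization_level):
    """Explore `optimization_level` random piece orderings and keep the
    best result found. Running time grows linearly with the level, while
    the reported solution monotonically improves or stays the same."""
    best = bars_used(sorted(pieces, reverse=True), stock_length)
    order = list(pieces)
    for _ in range(optimization_level):
        random.shuffle(order)
        best = min(best, bars_used(order, stock_length))
    return best
```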
References
[1] Garey, M.R., Johnson, D.S., Computers and Intractability: A Guide to the Theory of NP-Completeness, W.H. Freeman & Co, San Francisco, USA, 1979.