I am working on a self-study project but I cannot come up with a reliable answer.
Here is the deal:
I start with 500 cm sticks. I have orders from customers who want smaller sticks, and I am willing to cut each 500 cm stick into as many as 7 pieces.
How can I determine which combination of customer orders to use for each stick to minimize waste?
This is some sample customer order data:
+-----------+-------------+
| Firm Name | Length (cm) |
+-----------+-------------+
| firm1     | 34          |
| firm2     | 43          |
| firm3     | 52          |
| firm4     | 61          |
| firm5     | 62          |
| ...       | ...         |
| firm26    | 102         |
| firm27    | 152         |
| firm28    | 153         |
| firm29    | 163         |
| firm30    | 202         |
+-----------+-------------+
The important part is that the algorithm should minimize scrap.
I tried a genetic algorithm, but it does not return the correct solution every time.
Each stick's cutting plan must total 500 cm, or come as close to it as a given allowable waste permits. Cutting more pieces of a length than were ordered is acceptable; the surplus goes into stock.
This particular problem is a variation on a well-known (and well-studied) problem called the cutting stock problem.
In the classic definition of this problem, you have large rolls of product of a given width, and you need to cut them into smaller-width rolls with minimum waste (and minimum knife changes).
The reason this is so well studied is that even small gains in efficiency have huge impacts on industrial processes. It shows up in the paper industry (rolls of paper), film (rolls of film), and metal industries, with interesting variations in each; cutting paper to A4 size, for example, involves two cuts on different machines that can have different specifications (one being cheaper). The glass industry has a similar variant known as the guillotine problem, because the stock must be cut into rectangular pieces rather than rolls.
This problem is also closely related to the knapsack problem: finding the best single cutting pattern for one stick is itself a knapsack problem.
Knowing this, there are a number of algorithms that attempt to solve this class of problem, described in papers full of funky math symbols and conjectures.
Good-enough answers are possible quickly; perfect answers take a long time, because you will need to enumerate most, if not all, of the possibilities.
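To make the enumeration point concrete, here is a minimal brute-force sketch in Python that finds the single least-waste cutting pattern for one 500 cm stick. It uses the lengths from the question's sample table (the elided rows are omitted); the function name and structure are my own illustration, not a standard algorithm's API:

```python
from itertools import combinations_with_replacement

# Order lengths in cm, taken from the question's sample table
LENGTHS = [34, 43, 52, 61, 62, 102, 152, 153, 163, 202]
STICK = 500
MAX_PIECES = 7

def best_pattern(lengths, stick=STICK, max_pieces=MAX_PIECES):
    """Brute-force the single cutting pattern with the least waste.

    Enumerates every multiset of up to max_pieces lengths, which grows
    combinatorially -- exactly why exact methods get slow on real data.
    """
    best, best_waste = None, stick
    for n in range(1, max_pieces + 1):
        for combo in combinations_with_replacement(lengths, n):
            used = sum(combo)
            if used <= stick and stick - used < best_waste:
                best, best_waste = combo, stick - used
    return best, best_waste

pattern, waste = best_pattern(LENGTHS)
print(pattern, waste)  # for this data, a zero-waste pattern exists
```

For these sample lengths a perfect pattern happens to exist (for instance 202 + 153 + 102 + 43 = 500), but with 30 distinct lengths and many sticks the search space explodes, which is where heuristics come in.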
I did some work in a college operating systems class on allocating memory blocks to different-sized files using the best-fit, worst-fit, and first-fit algorithms, which, if you think about it, is essentially the same process as this problem. The overarching theme was that without knowing all of the requests ahead of time, you can't go through all the possible arrangements and find the least scrap. You can get close using something like best-fit, but then a different set of orders could make that worse than first-fit, and so on, since in a realistic system you can't guarantee looking far enough ahead to optimize everything fully.
Just working through a few quick scenarios with a few sticks and a few orders, varying the order in which the orders come in, should show why any such algorithm might not always return the 'most correct' solution.
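One such scenario can be sketched in a few lines of Python. This is an illustration with made-up order lengths (not the asker's data), chosen so that arrival order visibly changes the outcome of an online first-fit placement:

```python
def first_fit(orders, stick=500, max_pieces=7):
    """Online first-fit: place each order on the first stick it fits on."""
    sticks = []  # each stick tracked as [remaining_cm, piece_count]
    for length in orders:
        for s in sticks:
            if s[0] >= length and s[1] < max_pieces:
                s[0] -= length
                s[1] += 1
                break
        else:  # no existing stick fits -> start a new one
            sticks.append([stick - length, 1])
    return sum(s[0] for s in sticks)  # total scrap across all sticks

# Same orders, different arrival sequence:
print(first_fit([200, 200, 200, 300, 300, 300]))  # -> 500 (four sticks)
print(first_fit([300, 300, 300, 200, 200, 200]))  # -> 0   (three sticks)
```

When the 200 cm orders arrive first, two of them pair up on one stick and block the 300 cm orders, forcing a fourth stick; with the long orders first, everything packs perfectly. That sensitivity to arrival order is exactly the point made above.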
You're asking about a class of equivalent problems, namely bin packing (which is NP-hard; see http://en.wikipedia.org/wiki/Bin_packing_problem). You can't guarantee a perfect solution for an arbitrary number of customers and sticks within a reasonable (polynomial) amount of time. What you can do is use heuristics to get a 'best effort' solution.
Checking out bin packing and thinking about how to map your problem onto the usual best-effort solutions is probably the best approach.
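As a sketch of that mapping: each 500 cm stick is a "bin" with a capacity of 500 and a limit of 7 pieces, and first-fit decreasing (a standard bin-packing heuristic) sorts orders longest-first before placing them. The code below uses the sample lengths from the question's table (elided rows omitted); the dictionary layout and function name are my own illustration:

```python
# Sample orders from the question (firm name -> ordered length in cm)
ORDERS = {"firm1": 34, "firm2": 43, "firm3": 52, "firm4": 61, "firm5": 62,
          "firm26": 102, "firm27": 152, "firm28": 153, "firm29": 163,
          "firm30": 202}
STICK_LEN = 500
MAX_PIECES = 7

def first_fit_decreasing(orders, stick_len=STICK_LEN, max_pieces=MAX_PIECES):
    """First-fit decreasing: sort orders longest-first, then place each on
    the first stick with enough remaining length and a free piece slot."""
    sticks = []  # each stick: {"firms": [names], "remaining": cm left}
    for firm, length in sorted(orders.items(), key=lambda kv: -kv[1]):
        for s in sticks:
            if s["remaining"] >= length and len(s["firms"]) < max_pieces:
                s["firms"].append(firm)
                s["remaining"] -= length
                break
        else:
            sticks.append({"firms": [firm], "remaining": stick_len - length})
    return sticks

plan = first_fit_decreasing(ORDERS)
for s in plan:
    print(s["firms"], "scrap:", s["remaining"])
print("total scrap:", sum(s["remaining"] for s in plan))
```

Sorting longest-first usually helps because big pieces are the hardest to place, and the small ones fill the leftover gaps. It's still a heuristic, not a guarantee, for the reasons discussed above.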