Iterative Reduction to Null Matrix
Posted by user1459032 on Stack Overflow, 2012-12-15.
Here's the problem: I'm given a matrix like
Input:
1 1 1
1 1 1
1 1 1
At each step, I need to find a "second" matrix of 1's and 0's with no two 1's in the same row or column, and subtract it from the current matrix. I repeat the process until I get a matrix of all 0's, and I must do it in the least possible number of steps.
I need to print all the "second" matrices in O(n) time. In the above example I can get to the null matrix in 3 steps by subtracting these three matrices in order:
Expected output:
1 0 0
0 1 0
0 0 1

0 0 1
1 0 0
0 1 0

0 1 0
0 0 1
1 0 0
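
To be concrete, each step is an element-wise subtraction of the chosen "second" matrix from the current matrix. A minimal Python check (the function name is mine) that the three matrices above really do reduce the all-ones matrix to null in 3 steps:

    # Subtract a 0/1 "second" matrix (no two 1's in the same row or
    # column) element-wise from the current matrix.
    def subtract(matrix, second):
        return [[a - b for a, b in zip(row, srow)]
                for row, srow in zip(matrix, second)]

    m = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
    for second in ([[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                   [[0, 0, 1], [1, 0, 0], [0, 1, 0]],
                   [[0, 1, 0], [0, 0, 1], [1, 0, 0]]):
        m = subtract(m, second)
    assert all(v == 0 for row in m for v in row)  # null matrix reached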
I have coded an attempt in which I find the first maximum value in each pass and build the second matrix from that value's index (a sketch of this approach follows the output below). But for the above input I am getting 4 output matrices, which is wrong:
My output:
1 0 0
0 1 0
0 0 1

0 1 0
1 0 0
0 0 0

0 0 1
0 0 0
1 0 0

0 0 0
0 0 1
0 1 0
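
Since I haven't pasted the code itself, here is a simplified Python sketch of the approach I described (not my exact code, but it reproduces the 4-matrix output above): in each pass, every row takes the first still-unused column holding its maximum positive value.

    def greedy_step(matrix):
        # Per row: take the first unused column whose value is maximal
        # and positive; skip the row if no such column remains.
        n = len(matrix)
        used_cols = set()
        second = [[0] * n for _ in range(n)]
        for r in range(n):
            best_c, best_v = -1, 0
            for c in range(n):
                if c not in used_cols and matrix[r][c] > best_v:
                    best_c, best_v = c, matrix[r][c]
            if best_c >= 0:
                second[r][best_c] = 1
                used_cols.add(best_c)
        return second

    def reduce_greedy(matrix):
        steps = []
        while any(v > 0 for row in matrix for v in row):
            second = greedy_step(matrix)
            steps.append(second)
            matrix = [[a - b for a, b in zip(row, srow)]
                      for row, srow in zip(matrix, second)]
        return steps  # yields 4 steps on the all-ones 3x3 input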
My solution works for most of the test cases but fails for the one given above. Can someone give me some pointers on how to proceed, or suggest an algorithm that guarantees optimality? (A matching-based sketch appears at the end of this post.)
Test case that works:
Input:
0 2 1
0 0 0
3 0 0
Output:
0 1 0
0 0 0
1 0 0

0 1 0
0 0 0
1 0 0

0 0 1
0 0 0
1 0 0
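
As a hedged pointer toward optimality (my own sketch, not verified code): model the matrix as a bipartite multigraph with matrix[i][j] parallel edges between row i and column j. Each "second" matrix is then a matching, and König's edge-colouring theorem says the edges of a bipartite multigraph split into exactly D matchings, where D is the maximum degree, i.e. the maximum row or column sum. So D is the optimal step count, and one way to achieve it is to pad the matrix with slack until every row and column sums to D, then peel off one perfect matching per step, preferring real entries over slack. A rough Python sketch of that idea (all names are mine):

    def reduce_optimal(m):
        # Assumes a square matrix of non-negative integers; mutates m.
        n = len(m)
        row = [sum(r) for r in m]
        col = [sum(c) for c in zip(*m)]
        d = max(row + col)  # the optimal number of steps

        # Pad with slack so every row and column sums to exactly d.
        slack = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                t = min(d - row[i], d - col[j])
                if t > 0:
                    slack[i][j] += t
                    row[i] += t
                    col[j] += t

        def perfect_matching():
            # Kuhn's augmenting-path algorithm on the positive entries of
            # m + slack; the padded graph is regular, so a perfect
            # matching always exists.
            match_col = [-1] * n  # match_col[j] = row matched to column j
            def try_row(i, seen):
                for j in range(n):
                    if j not in seen and m[i][j] + slack[i][j] > 0:
                        seen.add(j)
                        if match_col[j] == -1 or try_row(match_col[j], seen):
                            match_col[j] = i
                            return True
                return False
            for i in range(n):
                try_row(i, set())
            return match_col

        steps = []
        for _ in range(d):
            second = [[0] * n for _ in range(n)]
            for j, i in enumerate(perfect_matching()):
                if m[i][j] > 0:  # prefer a real entry over slack
                    m[i][j] -= 1
                    second[i][j] = 1
                else:
                    slack[i][j] -= 1
            steps.append(second)
        return steps

On the all-ones example this produces exactly 3 matrices, since the maximum row sum there is 3.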