Well, finding the chunk of binary data that makes sense amidst a sea of noise is already practised in forensic computing, but I agree that this isn't quite the same problem. My short answer is that I don't have a real answer.
My long answer is pretty much a bunch of conjecture - do we need to build a bunch of different parallel systems that each model one of the possible states for c simultaneously, and see which one makes sense? I hope not. I think/hope that some of the heavy lifting is going to be done by the guys thinking about quantum hard drives. I'm hoping that just as we have techniques for finding legitimate file systems on hard drives today, in the future we will have techniques for identifying valid file systems within quantum superposed datasets. So c comes out as a quantum memory space, and we apply these techniques to scan the superposed state for valid data? As I say, I don't really have hard answers...
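For what it's worth, the classical analogue I have in mind is basically signature carving: scan raw bytes for known magic numbers to spot meaningful data amid noise. Here's a toy Python sketch of that idea (the signature table and the planted-PNG test are purely illustrative, not any real forensic tool's API):

```python
import os

# A few well-known file signatures ("magic numbers") - illustrative list only.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"\xff\xd8\xff": "JPEG image",
    b"PK\x03\x04": "ZIP/Office archive",
    b"%PDF-": "PDF document",
}

def carve(buffer: bytes):
    """Return (offset, description) for every known signature found in the buffer."""
    hits = []
    for magic, desc in SIGNATURES.items():
        start = 0
        while (idx := buffer.find(magic, start)) != -1:
            hits.append((idx, desc))
            start = idx + 1
    return sorted(hits)

if __name__ == "__main__":
    # Bury a PNG header in random noise, then try to find it again.
    noise = os.urandom(4096)
    planted = noise[:1000] + b"\x89PNG\r\n\x1a\n" + noise[1000:]
    for offset, desc in carve(planted):
        print(f"possible {desc} at offset {offset}")
```

The open question, of course, is whether anything like this scan can be run over a superposition rather than a classical byte buffer.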