Hallucinations are unstoppable // LLMs are unable to work with unknown tasks
Original Puzzle: There are 4 persons (A, B, C and D) who want to cross a bridge at night.
- A takes 1 minute to cross the bridge.
- B takes 2 minutes to cross the bridge.
- C takes 5 minutes to cross the bridge.
- D takes 8 minutes to cross the bridge.
There is only one torch with them and the bridge cannot be crossed without the torch. There cannot be more than two persons on the bridge at any time, and when two people cross the bridge together, they must move at the slower person’s pace.
Optimal schedule:
- Step 1: A and B cross (2 min), A returns with the torch (1 min) — 3 minutes.
- Step 2: C and D cross (8 min), B returns with the torch (2 min) — 10 minutes.
- Step 3: A and B cross again — 2 minutes.
Total time spent: 3 + 10 + 2 = 15 minutes.
To minimize the time:
The trick is that only the fastest people should come back (and only when a return is needed at all — here someone has to bring the torch back). A returns in step 1 and B returns in step 2. Finally, minimize the number of return trips: C and D never come back.
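As a sanity check, here is a small brute-force sketch (not from the original article) that enumerates every schedule of the canonical form "two cross forward, one brings the torch back" and confirms that 15 minutes is the minimum. The `TIMES` dictionary and the `schedules()` helper are names introduced here for illustration.

```python
from itertools import combinations

# Crossing times from the puzzle statement.
TIMES = {"A": 1, "B": 2, "C": 5, "D": 8}

def schedules():
    """Yield (total_time, plan) for every schedule of the canonical form:
    two people cross forward, one brings the torch back, repeated until
    everyone is on the far side (5 crossings for 4 people)."""
    def step(near, far, elapsed, plan):
        if not near:
            yield elapsed, plan
            return
        for pair in combinations(sorted(near), 2):
            t_forward = max(TIMES[p] for p in pair)      # slower person's pace
            new_near = near - set(pair)
            new_far = far | set(pair)
            if not new_near:                              # last trip, nobody returns
                yield elapsed + t_forward, plan + [pair]
                continue
            for back in sorted(new_far):                  # someone returns the torch
                yield from step(new_near | {back},
                                new_far - {back},
                                elapsed + t_forward + TIMES[back],
                                plan + [pair, (back,)])
    yield from step(set(TIMES), set(), 0, [])

best_time, best_plan = min(schedules(), key=lambda s: s[0])
print(best_time)   # 15
print(best_plan)   # the corresponding crossing sequence
```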
Great puzzle, but what if we change the conditions? Only one thing: let's find the longest time instead.
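A note on my own assumptions here: "longest time" is unbounded if pointless back-and-forth crossings are allowed, so the natural reading keeps the same two-forward/one-back structure. Under that assumption, swapping `min` for `max` in the sketch above answers the new question: D ferries the torch on every trip, 8 × 5 = 40 minutes.

```python
# Reusing the schedules() sketch above: the worst case under the same
# two-forward/one-back structure is D crossing and returning every time.
worst_time, worst_plan = max(schedules(), key=lambda s: s[0])
print(worst_time)   # 40
```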
Models hallucinate! // there is no logical reasoning
What about new “reasoning” models?
LLMs are unable to operate with logic; like any neural networks, they can only reproduce patterns. No pattern in training == no solution at inference.
The same trick works for other puzzles: take one, change some of its conditions == gotcha!