Quote Originally Posted by Tellos Athenaios View Post
My opinion is that once you take the step from "here's an algorithm which lets you compute this kind of problem" to "here's how you can deduce what algorithm will compute your problem", the numbers switch from being the things that make life difficult to fairly arbitrary constants. In other words, when you take the step from computation to math. If the question is about math, the numbers don't matter at all.
I have no objections to this, other than the remaining problem that it's still easier to find algorithms when higher numbers are present. It's not really a talking point; it's a simple fact one must be aware of. Higher numbers are not more difficult because they're intrinsically harder, they're more difficult because our brains tend to mess up more. For example, accidentally adding an extra zero is a lot easier to do when writing "100000" than when writing "10".

Increasing the numbers is a nifty way to test an algorithm (or your ability to produce an algorithm) without actually adding anything new to the original problem. It's comparable to the satanic practice of slipping an irrelevant piece of information into a word problem: "calculate the area in m² of a room with sides of 2m and 4m" is easier than "calculate the area in m² of a room with sides of 2m and 4m and a height of 3m".
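To make the point concrete, here's a minimal sketch (the function name and inputs are my own, not from the thread): the same area algorithm handles small and large numbers identically, and the irrelevant datum, the room height, is accepted but never used, just like the extra detail in the word problem.

```python
# Sketch of the point above: the algorithm doesn't change when the
# numbers grow, and irrelevant data (the room height) is simply unused.
# Names and values here are illustrative, not from the thread.

def floor_area(width, length, height=None):
    # height is the "satanic" extra datum: accepted, never used
    return width * length

# Same algorithm, small numbers and big numbers:
print(floor_area(2, 4, height=3))   # 8
print(floor_area(20000, 40000))     # 800000000
```

The algorithm is identical in both calls; only the opportunity for a slipped digit grows with the bigger inputs.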