The question asks: "is it possible to have a problem that actually gets easier as the inputs grow in size?" What if the input is a resource that the algorithm uses to get a job done? It is common knowledge that the more resources there are, the better. Below is an example in which the more employees there are, the better.
1) You are given two inputs:
i) The number of employees in an industry. This is a natural number and is the main input $n$.
ii) Information about the industry. There are $t$ tasks to be done by the workers, labeled A, B, C, D... There are $p$ places that connect the tasks, enabling the employees to switch between them; the places are labeled 0, 1, 2, 3... The information is given as a simple directed graph made up of routes, for example A-1, 1-2, 2-B, C-3... For simplicity, each route has a cost of 1 (a small parsing sketch follows this list).
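To make the input concrete, here is a minimal Python sketch of how such a route list could be turned into an adjacency structure; the function name `parse_routes`, the 'X-Y' string format and the example instance are my own assumptions, since the problem statement only fixes the idea of unit-cost directed routes.

```python
def parse_routes(routes):
    """Turn route strings such as 'A-1' or '1-2' into a directed adjacency dict.

    Tasks are letters and places are digits; every route has cost 1,
    so only the direction of each edge needs to be stored.
    """
    graph = {}
    for route in routes:
        src, dst = route.split('-')
        graph.setdefault(src, []).append(dst)
        graph.setdefault(dst, [])  # make sure pure sinks also appear as keys
    return graph

# Hypothetical instance with tasks A..E and places 1..3:
example_graph = parse_routes(['A-1', '1-B', '1-C', 'B-2', '2-D', 'C-3', '3-E'])
print(example_graph)  # {'A': ['1'], '1': ['B', 'C'], 'B': ['2'], ...}
```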
2) Problem:
Starting from the first task, A, and with $n$ employees, you are to find an optimal strategy for the employees to use in order to visit all the tasks. Tasks can be visited more than once. Recall that you cannot switch between two tasks unless there is a path between them, and the path from task A to B is independent of that from B to A.
3) Output:
The output is the paths between tasks to be taken by the employees. Each path is associated with the number of employees taking it. For example:
A to B with $n_1$ employees (a forward path)
A to C with $n_2$ employees (a forward path)
B to D with $n_3$ employees (a forward path)
D to B with $n_4$ employees (a reversed path)
B to E with $n_5$ employees (a forward path)
4) Possible solution:
One possible solution is to first compute the shortest paths from A to its closest nodes. These will be forward paths. Then recursively compute the forward paths from each visited task. The result is a tree (a sketch of this step follows the example tree below). For example:
A
B C
D E
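Here is a minimal sketch of this tree-building step, assuming the routes have already been parsed into an adjacency dict as above; `forward_tree` is a plain breadth-first search, and the name and representation are my own choices rather than part of the problem statement.

```python
from collections import deque

def forward_tree(graph, start):
    """Breadth-first search from `start` over the directed, unit-cost route graph.

    Returns a dict mapping every reachable node (task or place) to its
    children in the tree of shortest forward paths.
    """
    tree = {start: []}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in tree:  # first visit, hence reached by a shortest path
                tree[neighbour] = []
                tree[node].append(neighbour)
                queue.append(neighbour)
    return tree

# Using the hypothetical instance from the parsing sketch above:
routes = {'A': ['1'], '1': ['B', 'C'], 'B': ['2'], '2': ['D'],
          'C': ['3'], '3': ['E'], 'D': [], 'E': []}
print(forward_tree(routes, 'A'))
```

Contracting the intermediate places in the result gives exactly the example tree above: A with children B and C, B with child D, C with child E.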
Now it is time to determine how the employees will traverse the tree so as to visit all the tasks. Starting from task A with $n$ employees, $n_1$ are sent to the left subtree and $n_2$ are sent to the right subtree. If $n_2 \neq 0$ then nobody from the left subtree will ever need to go to the right subtree.
For $n=\infty$ the employees will all just move forward. For $n = 1$ the single employee will have to use reverse paths in order to visit the other tasks. To go from D to B, for example, the algorithm will have to compute that shortest path, which is extra computation. Why not directly compute a shortest path from D to E instead? Fine, but that is still extra computation.
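A minimal sketch of this splitting step, under the assumption that the tree comes from the BFS sketch above; the function `assign`, the plan/`extra_paths` representation and the even split are my own illustrative choices, not the unique optimal strategy.

```python
def assign(tree, node, employees, plan, extra_paths):
    """Distribute `employees` over the subtrees rooted at `node`.

    `plan` collects (from, to, group_size) forward moves; `extra_paths`
    collects the reverse shortest paths that still have to be computed
    whenever a group must come back to cover a sibling subtree.
    Returns the last node reached, so a lone group knows where it stands.
    """
    children = tree.get(node, [])
    if not children:
        return node
    last = node
    if employees >= len(children):
        # Enough employees: each subtree gets its own group, nobody backtracks.
        share, remainder = divmod(employees, len(children))
        for i, child in enumerate(children):
            group = share + (1 if i < remainder else 0)
            plan.append((node, child, group))
            last = assign(tree, child, group, plan, extra_paths)
    else:
        # Too few employees: the same group covers the subtrees one after the
        # other, paying for a reverse shortest path at each switch.
        for i, child in enumerate(children):
            if i > 0:
                extra_paths.append((last, child))  # e.g. the D-to-B style path
            plan.append((node, child, employees))
            last = assign(tree, child, employees, plan, extra_paths)
    return last

# With the tree from the BFS sketch above (hypothetical data):
tree = {'A': ['1'], '1': ['B', 'C'], 'B': ['2'], '2': ['D'],
        'C': ['3'], '3': ['E'], 'D': [], 'E': []}
for n in (4, 1):
    plan, extra = [], []
    assign(tree, 'A', n, plan, extra)
    print(n, 'employees -> extra reverse paths:', extra)
```

With $n$ at least as large as the widest branching, `extra_paths` stays empty and the employees only move forward; with $n = 1$ every branching forces an extra reverse shortest-path computation, so the work grows as $n$ shrinks.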
But of course the computation time will not decrease indefinitely (and, by the way, an $n$ that is too large simply leads to poor resource management).
One real problem with this property that comes to mind is unsalted password hash cracking when it is formulated like "given $n$ hashes, crack at least one hash". Since cracking speed would scale linearly with $n$, running time would be proportional to $1/n$ – except that we can't actually assign a definitive time since cracking is stochastic and does not have a constant upper bound on time. – amon – 2016-01-02T09:09:28.107
@amon The running time doesn't scale like $1/n$. It takes time $n$ just to read the $n$ hashes you've been given as input! – David Richerby – 2016-01-02T09:33:30.670
Do you mean easier in absolute or relative terms? Which cost measures do you permit? Do you require strictly decreasing cost, or is non-increasing (from some point on) enough? – Raphael – 2016-01-02T10:25:28.587
@DavidRicherby In this example, it is legitimate to ignore the cost of reading the input as long as I don't make any statement about the absolute cost. Instead, the speed increases linearly with the input. Therefore, $n \cdot T(1) > T(n)$ even when considering the cost of reading the input. I.e. for this problem it is easier to solve a large input at once rather than splitting up the input, even though the problem is divisible. I am not saying that $T(n) > T(n+1)$ for all $n$. – amon – 2016-01-02T10:48:56.073
@amon The question asks for a problem that gets easier as the input gets bigger, not a problem that merely benefits from economies of scale. And, in fact, your problem doesn't benefit from economies of scale. There are two ways to make the input bigger: give more hashes, or give longer hashes. The worst-case $n$-bit input is a single $n$-bit hash, not $n/b$ $b$-bit hashes. – David Richerby – 2016-01-02T11:35:40.150
To everybody who wants to post yet another answer of the form, "Some problem where the input is a question plus a big bunch of hints about the answer": This does not work. The hardest inputs of length $n$ are the ones where you use all $n$ bits to ask the question and give no hints. The fact that it's easy to deal with short questions with a lot of hints does not mean the worst-case running time is good. – David Richerby – 2016-01-02T15:51:39.450
If you're willing to substitute 'easier' with 'more valuable' then we could talk about the Network Effect / Metcalfe's Law. – A E – 2016-01-02T19:09:33.500
What about problems for which it is faster to prove that there is "no solution" than to actually solve them? The problem would not "solve itself", but would instead show that "there is obviously no solution". – Uriel – 2016-01-02T20:02:23.547
@DavidRicherby It's O(n + <extremely large constant>/n). – user253751 – 2016-01-03T01:10:22.610
@immibis Traditionally, $n$ is the number of bits in the input. If I give you a single $n$-bit hash, it's going to take time about $2^n$ to get a match. – David Richerby – 2016-01-03T01:36:06.450
@DavidRicherby Not for a fixed hash function? – user253751 – 2016-01-03T02:08:13.863
I think the most interesting thing you could hope for is an interesting combinatorial problem (e.g., number of graphs on $n$ nodes with some property $P(n)$) which requires you to enumerate graphs for relatively small $n$, but maybe you can prove the answer is $0$ for, say, $n \ge 50$. (Here's a stupid one: $P(n) =$ has less than 50 nodes.) – Kimball – 2016-01-03T05:59:11.187
@Kimball but then the solution would be simply to count the number of nodes and stop after 50. But then the complexity would grow and then become constant after a certain threshold... not easier. Moreover I believe that any problem with less than linear worst case is simply trivial and uninteresting. – Bakuriu – 2016-01-03T16:55:26.667
@Bakuriu I wasn't interpreting easier in the sense of complexity theory, but in a practical sense (which was maybe what the OP meant). There are certainly very interesting mathematical problems which are hard for some small $n$ but easier for larger $n$. A notable one (though not an answer to the OP) is the generalized Poincaré conjecture. – Kimball – 2016-01-03T17:37:11.567
I second what @Kimball wrote, and would like to give known results from Ramsey theory as such examples: the complexity indeed becomes constant after a certain threshold for many such questions, and is therefore in many cases much easier than for smaller instances. – mat – 2016-01-03T23:16:14.230
Not a 'computational problem' as such, but some things based in statistics get easier as the volume of inputs increases. For instance, 'If I flip a coin N times, what % will be heads (accurate to ±10%)?': with n = 2, you're very unlikely to get it right; with n = 10000, you're very likely to get it right within 10%. – JeffUK – 2016-01-04T13:35:19.477
"It takes time $n$ just to read the $n$ hashes you've been given as input!" – who said you must read all input? If it takes $O(n + HugeC/n)$ it's optimal to read at most $n' = \sqrt(HugeC)$. OK, that bottoms out to a const. But for evil inputs I'll use randomized brute force, and I think for expected time it's beneficial to read progressively more hashes as the brute force goes on failing? [I'm talking fixed-function $n$ distinct hashes problem; assuming hashes are really hard and strong randomness; worst: assuming unbounded O(1) RAM hides a lie!] – Beni Cherniavsky-Paskin – 2016-01-05T14:48:56.523
I cheated: "fixed $b$-bit function $n$ distinct hashes" has no legal inputs for $n > 2^b$! Also, brute force can never take more than $HugeC \cdot 2^b$, so at some point there won't be a reason to read more hashes. But imagine a family of $\log n$-wide hash functions with $n/\log n$ inputs? Point is, "hit at least one target" does get easier in a strong sense the more targets you're given, and it's not obvious that can't be captured formally. There might be a deep reason this is impossible, but "you must read all input" or "$T$ is an integer > 0 so can't decrease forever" feel shallow... – Beni Cherniavsky-Paskin – 2016-01-05T16:01:18.397