(Assume the programs are of length n, generated from random ASCII characters 0-127, and that n starts at 1, is 2 the next time around, then 3, and so forth.)
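For concreteness, here is a minimal sketch of that generation scheme in Python. The original doesn't fix a target language, so `random_program` and the round loop are purely illustrative:

```python
import random

def random_program(n: int) -> str:
    # A length-n string of uniformly random ASCII characters 0-127.
    return ''.join(chr(random.randrange(128)) for _ in range(n))

# n starts at 1 and grows by one each round.
for n in range(1, 6):
    candidate = random_program(n)
    print(repr(candidate))
```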
We are splitting the programs generated this way into two piles.
The first pile holds all the programs that don't compile, plus all the programs that compile but don't rely on the size of the array we want to sort. Only a very small fraction of programs compile at all, and as n grows that fraction shrinks rapidly: there are 128^n possible strings of length n, while the number of syntactically valid programs grows far more slowly, so the chance that at least one random character breaks the syntax keeps increasing. As n goes to infinity, the ratio of programs that don't compile to programs that do diverges; in other words, the count of non-compiling programs tends to infinity while the fraction that compiles tends to zero. Therefore we can treat every program in the first pile as a program that doesn't compile.
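To make the "very small fraction compiles" claim concrete, here is a hedged sketch that estimates the compile rate empirically, using Python's built-in compile() as the syntax check. Since the original doesn't name a language, the exact rates are illustrative only; the point is that they fall quickly as n grows:

```python
import random

def random_program(n: int) -> str:
    return ''.join(chr(random.randrange(128)) for _ in range(n))

def compile_rate(n: int, trials: int = 10_000) -> float:
    # Fraction of random length-n strings that Python accepts as syntax.
    ok = 0
    for _ in range(trials):
        try:
            compile(random_program(n), '<random>', 'exec')
            ok += 1
        except (SyntaxError, ValueError):  # ValueError covers NUL bytes
            pass
    return ok / trials

for n in (1, 2, 4, 8, 16):
    print(n, compile_rate(n))
```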
The second pile holds the programs that compile and actually rely on the size of the array. The complexity of these programs depends on their algorithm; we assume the worst case, where every such algorithm is of complexity O(n!).
But the chance of an error in a second-pile program grows just as fast, so the number of programs in the second pile, compared to the first, also approaches zero as n goes to infinity.
The complexity of the programs in the first pile is constant, since they don't rely on the input (where the input is the size of the array); the constant is n, the number of steps the program takes.
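As a hypothetical illustration of a first-pile survivor: a fragment like the one below is syntactically valid, but it never reads the array (however that would be supplied), so its running time is a constant fixed by the program text itself, not by the array's size:

```python
# A "first pile" program: it compiles and runs, but the array is
# never touched, so the step count cannot vary with the array size.
x = 1
y = x + 1
print(y)
```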
As n tends to infinity, the proportion of second-pile programs tends to zero and their complexity becomes negligible. Therefore the total complexity is either O(1) or unbounded.
If you choose n to stop at a finite number, then the complexity of the program is O(1).
If you let n grow without bound, then the algorithm never ends.
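Putting the two stopping regimes together, a sketch of the whole procedure might look like this. Assumptions on top of the original post: Python as the target language, compile()/exec() as the compile-and-run step, and a comparison against sorted() as the success test; a real harness would also need a timeout, since a random program that compiles can still loop forever:

```python
import random

def random_program(n: int) -> str:
    return ''.join(chr(random.randrange(128)) for _ in range(n))

def search(arr, max_len=None):
    # With a finite max_len the loop runs a bounded number of rounds
    # regardless of len(arr), i.e. O(1) in the array size; with
    # max_len=None it may never return.
    n = 1
    while max_len is None or n <= max_len:
        src = random_program(n)
        try:
            code = compile(src, '<random>', 'exec')
        except (SyntaxError, ValueError):
            n += 1
            continue
        env = {'arr': list(arr)}
        try:
            exec(code, env)  # thought experiment only: running random code is unsafe
        except Exception:
            n += 1
            continue
        if env.get('arr') == sorted(arr):
            return src  # a program that left arr sorted
        n += 1
    return None  # gave up at the finite cutoff
```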
Feel free to comment; please refer to specific parts of the solution.
edit: The complexity of a program that fails to compile, or that compiles but doesn't rely on the size of the array, is O(1), since that specific program always executes in the same time: it takes no variable input that changes with the size of the array.
My explanation is not easy to understand (which is not to say it is very clear, or good). Your best bet here would probably be to study big-O notation (plus the math of function limits) and work out the solution for yourself (there definitely is one), as it can quickly become confusing as to what you are really measuring.
I understand big-O notation just fine. I asked a specific question about your post, and nothing that indicated I misunderstood the principle in general. You spoke of "choosing n" to be either infinite or non-infinite, which doesn't make sense to me given the algorithm I proposed.