Keywords: Algorithm Analysis | Logarithmic Factorial | Asymptotic Complexity
Abstract: This article presents a proof of the asymptotic equivalence between log(n!) and n·log(n). By expressing log(n!) as a sum of logarithms, it shows how to establish an upper bound via n! ≤ n^n and a lower bound via n! ≥ (n/2)^(n/2), ultimately proving log(n!) = Θ(n·log(n)). Rigorous derivation, intuitive explanation, and code examples together elucidate this core concept in algorithm analysis.
Introduction
In algorithm complexity analysis, the asymptotic behavior of logarithmic factorial log(n!) is a significant topic. This article aims to prove that log(n!)=Θ(n·log(n)), meaning log(n!) is asymptotically equivalent to n·log(n). This conclusion is particularly crucial in analyzing algorithms involving permutations and combinations, such as certain sorting and searching algorithms.
Mathematical Foundation and Problem Definition
First, recall the basic properties of factorial and logarithm. The factorial n! is defined as n×(n-1)×⋯×2×1, so log(n!) can be expressed as a sum of logarithms:
log(n!) = log(1) + log(2) + ... + log(n-1) + log(n)
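As a quick sanity check, this sum-of-logarithms identity can be verified numerically against math.factorial (a minimal sketch; the helper name log_factorial_sum is ours):

```python
import math

def log_factorial_sum(n):
    # log(n!) computed term by term as log(1) + log(2) + ... + log(n)
    return sum(math.log(k) for k in range(1, n + 1))

for n in (5, 20, 50):
    direct = math.log(math.factorial(n))  # log of the full product
    summed = log_factorial_sum(n)
    assert math.isclose(direct, summed, rel_tol=1e-9)
```

Summing logarithms rather than taking the log of the product also avoids overflow for large n, since n! itself grows astronomically while log(n!) stays modest.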
Our goal is to prove log(n!) = Θ(n·log(n)), which requires establishing both the upper bound O(n·log(n)) and the lower bound Ω(n·log(n)).
Upper Bound Proof: O(n·log(n))
To prove the upper bound, we leverage the monotonicity of the logarithmic function. For each k from 1 to n, log(k) ≤ log(n) because k ≤ n. Thus:
log(1) + log(2) + ... + log(n) ≤ log(n) + log(n) + ... + log(n) = n·log(n)
This shows that log(n!) ≤ n·log(n), so log(n!) = O(n·log(n)). This proof is intuitive and straightforward, requiring no complex derivations.
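The termwise bound log(k) ≤ log(n) can be spot-checked numerically (a sketch; log_factorial is our helper name):

```python
import math

def log_factorial(n):
    # log(n!) as a sum of logarithms
    return sum(math.log(k) for k in range(1, n + 1))

# Each term log(k) is at most log(n), so the sum is at most n*log(n).
for n in (2, 10, 100, 1000):
    assert log_factorial(n) <= n * math.log(n)
```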
Lower Bound Proof: Ω(n·log(n))
The lower bound proof is slightly more intricate. We discard the first half of the sum and consider only terms from n/2 to n. For k ≥ n/2, log(k) ≥ log(n/2). Therefore:
log(1) + ... + log(n) ≥ log(n/2) + ... + log(n) ≥ (n/2)·log(n/2)
Further simplifying log(n/2):
log(n/2) = log(n) - log(2)
Hence:
(n/2)·log(n/2) = (n/2)·(log(n) - log(2)) = (n/2)·log(n) - (n/2)·log(2)
For n ≥ 4 we have log(n) ≥ 2·log(2), so (n/2)·log(2) ≤ (n/4)·log(n), and therefore log(n!) ≥ (n/2)·log(n) - (n/2)·log(2) ≥ (n/4)·log(n). This establishes log(n!) = Ω(n·log(n)), and combining it with the upper bound yields log(n!) = Θ(n·log(n)).
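The half-sum bound log(n!) ≥ (n/2)·log(n/2) holds already at small n, as a quick numerical check confirms (a sketch; log_factorial is our helper name):

```python
import math

def log_factorial(n):
    # log(n!) as a sum of logarithms
    return sum(math.log(k) for k in range(1, n + 1))

# Keeping only the terms with k >= n/2, each at least log(n/2),
# gives log(n!) >= (n/2) * log(n/2).
for n in (4, 10, 100, 1000):
    assert log_factorial(n) >= (n / 2) * math.log(n / 2)
```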
Intuitive Explanation and Supplementary Analysis
The choice of n^n and (n/2)^(n/2) as bounding quantities stems from their natural relationship to the factorial: comparing the n factors of n! with n copies of n gives n! ≤ n^n, the upper bound, while the last n/2 factors, each at least n/2, give n! ≥ (n/2)^(n/2), the lower bound. A more precise estimate is available via Stirling's formula, log(n!) = n·log(n) - n + O(log(n)).
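Stirling's leading terms can be compared against math.lgamma, which computes log(Γ(n+1)) = log(n!) directly (a rough numerical sketch, not part of the proof):

```python
import math

for n in (10, 100, 1000, 10**6):
    exact = math.lgamma(n + 1)       # log(n!) via the gamma function
    stirling = n * math.log(n) - n   # leading terms of Stirling's formula
    # The relative error shrinks as n grows, because the omitted
    # correction term is only O(log n).
    rel_err = abs(exact - stirling) / exact
    print(f"n={n}: relative error = {rel_err:.5f}")
```

This also explains why simply comparing log(n!) with n·log(n) at small n can be misleading: the subtracted n term is still significant there, even though it vanishes asymptotically relative to n·log(n).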
Code Example and Verification
The following Python code demonstrates the asymptotic behavior of log(n!) and n·log(n):
import math

def log_factorial(n):
    return sum(math.log(i) for i in range(1, n + 1))

def n_log_n(n):
    return n * math.log(n)

# Test n = 10, 100, 1000
for n in [10, 100, 1000]:
    log_fact = log_factorial(n)
    n_log = n_log_n(n)
    ratio = log_fact / n_log
    print(f"n={n}: log(n!)={log_fact:.2f}, n*log(n)={n_log:.2f}, ratio={ratio:.2f}")
The output shows the ratio climbing toward 1 (roughly 0.66 at n=10, 0.79 at n=100, and 0.86 at n=1000). Convergence is slow because log(n!) = n·log(n) - n + O(log(n)), so the subtracted linear term remains noticeable at moderate n. In the code, log_factorial computes log(n!) as a sum of logarithms, n_log_n computes n·log(n), and the printed ratios make the asymptotic relationship visible.
Conclusion
Through rigorous mathematical derivation and code verification, we have proven that log(n!)=Θ(n·log(n)). This result has broad applications in algorithm design, such as analyzing lower bounds of comparison-based sorting algorithms. Understanding this proof deepens comprehension of complexity theory.