There seems to be a consensus that integer factorization is hard (in some precise computational sense). Is it known whether polynomial factorization is computationally easy or hard?
One thing I originally thought was that if we could factor polynomials easily, then we could factor n by finding a polynomial p(x) with p(m)=n for some m (for instance, by taking the base-m digits of n as coefficients), then "easily factorizing" p to get p(x)=q(x)\cdot r(x). Then q(m)\cdot r(m) would be a factorization of n. But we wouldn't get any information if one of q(m), r(m) happened to be 1, or if p turned out to be irreducible.
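To make the idea concrete, here is a small sketch of the evaluation step (the helper name `poly_eval` and the specific numbers are mine, just for illustration): given a factorization p(x) = q(x)·r(x) over the integers, evaluating both factors at m yields an integer factorization of n = p(m).

```python
def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients listed low degree to high) via Horner's rule."""
    result = 0
    for c in reversed(coeffs):
        result = result * x + c
    return result

# Example: n = 15 with p(x) = x^2 - 1, since p(4) = 15.
# Over Z, p factors as (x - 1)(x + 1).
p = [-1, 0, 1]   # x^2 - 1
q = [-1, 1]      # x - 1
r = [1, 1]       # x + 1
m = 4

n = poly_eval(p, m)   # 15
a = poly_eval(q, m)   # 3
b = poly_eval(r, m)   # 5
assert a * b == n     # 3 * 5 == 15, a nontrivial factorization

# The catch mentioned above: the base-m encoding may be irreducible
# (e.g. 91 in base 10 gives 9x + 1, which does not factor over Z),
# or one factor may evaluate to 1, in which case we learn nothing.
```

Of course, this only shows that a polynomial factorization, *when it exists and is nontrivial at m*, transfers to the integer; it does not show that such a polynomial is easy to find for a given n.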
Does anyone have a solution or a reference for this problem? (I searched online, but all I could find was how to factor polynomials by hand, as in high-school algebra problems.)