Prove that, for all real values of a, the equation x^2 + 2ax + 2a^2 + a + 1 = 0 has no real roots for x.
Substitute the coefficients of x^2 + 2ax + 2a^2 + a + 1 = 0, namely A = 1, B = 2a, and C = 2a^2 + a + 1, into the discriminant B^2 - 4AC.
B^2 - 4AC
= (2a)^2 - (4)(1)(2a^2 + a + 1)
= 4a^2 - 8a^2 - 4a - 4
= -4a^2 - 4a - 4
= -4a^2 - 4a + 8 - 12
= -4(a^2 + a - 2) - 12
= -4(a + 2)(a - 1) - 12
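As a quick sanity check on the factoring step above, expanding -4(a+2)(a-1) - 12 should reproduce -4a^2 - 4a - 4 exactly. A minimal sketch (the sample values of a are arbitrary choices, not from the original):

```python
# Illustrative check: the factored form -4(a+2)(a-1) - 12 should equal
# the expanded discriminant -4a^2 - 4a - 4 for any value of a.
for a in [-2.0, -0.5, 0.0, 1.0, 3.5]:  # arbitrary sample points
    lhs = -4 * (a + 2) * (a - 1) - 12
    rhs = -4 * a ** 2 - 4 * a - 4
    assert lhs == rhs  # the two expressions agree at every sample
print("factored form matches the expanded discriminant")
```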
As a function of a, this expression equals -4a^2 - 4a - 4, a downward-opening parabola (leading coefficient -4 < 0), so its maximum occurs at the vertex a = -(-4)/(2(-4)) = -0.5. The largest value of -4(a+2)(a-1) - 12 is therefore -4[(-0.5)+2][(-0.5)-1] - 12 = (-4)(1.5)(-1.5) - 12 = 9 - 12 = -3.
Since B^2 - 4AC <= -3 < 0 for every real a, the equation x^2 + 2ax + 2a^2 + a + 1 = 0 has no real roots for x.
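The whole argument can be checked numerically. A minimal sketch (the sampling grid is an assumption for illustration; it deliberately includes the vertex a = -0.5, where the discriminant should reach its maximum of -3):

```python
# Illustrative check: the discriminant B^2 - 4AC = -4a^2 - 4a - 4
# should be at most -3 (hence negative) for every sampled real a.
def discriminant(a):
    # B^2 - 4AC with A = 1, B = 2a, C = 2a^2 + a + 1
    return (2 * a) ** 2 - 4 * 1 * (2 * a ** 2 + a + 1)

# Sample a on a grid from -10 to 10, which includes the vertex a = -0.5.
samples = [i / 100 for i in range(-1000, 1001)]
worst = max(discriminant(a) for a in samples)
print(worst)  # -3.0, attained at a = -0.5; always below zero
```

Because the maximum found is -3, every sampled discriminant is negative, matching the proof's conclusion that the quadratic in x has no real roots.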