This is not true in any sense, but it is what some people think when they don't know much about the underlying principles and just see fractional pixels.
A blur lowers the frequency content of a signal after the fact; anti-aliasing is representing that signal more accurately in the first place, when sampling it into discrete values.
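A toy sketch of why the two are not interchangeable: aliasing happens at sampling time, so once the samples are taken the high-frequency information is already gone, and blurring afterwards only smooths the wrong signal. The classic 1D case (frequencies here are just made up for the demo): a 9 Hz sine sampled at 10 Hz produces the exact same samples, up to sign, as a 1 Hz sine.

```python
import math

def sample(freq_hz, rate_hz, n):
    # point-sample a sine wave at n uniform sample instants
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

# 9 Hz sampled at 10 Hz aliases down to 1 Hz: the sample sets are
# indistinguishable (sign-flipped), so no post-processing can tell
# which tone was actually there before sampling.
hi = sample(9, 10, 10)
lo = sample(1, 10, 10)
print(max(abs(a + b) for a, b in zip(hi, lo)))  # ~0: identical up to sign
```

An anti-aliasing filter removes the 9 Hz component *before* sampling; a blur applied to `hi` after the fact just smooths what now looks like a 1 Hz wave.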
Do some animated aliased 3D renders, then try blurring them to get the same result as an anti-aliased version.
Look at a checkerboard pattern receding into the distance. If it is anti-aliased, the pattern eventually converges to grey: once the squares shrink below a pixel, the integral of everything under each pixel averages out to grey. Blurring the entire frame gives a very different result.
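The checkerboard case can be sketched numerically (a toy model, not any renderer's actual code; names and the square size are made up). Supersampling approximates the integral of the pattern under each pixel, so when the squares are smaller than a pixel the result lands near 0.5 grey. Point-sampling the same pattern gives hard 0/1 values, and box-blurring those afterwards just smears whatever moiré the sampling already baked in.

```python
def checker(x, y, size):
    # 1 on "white" squares, 0 on "black", with squares of the given size
    return (int(x // size) + int(y // size)) % 2

def antialiased_pixel(px, py, size, n=16):
    # approximate the integral of the pattern under a 1x1 pixel
    # with an n x n grid of sample points
    samples = [checker(px + (i + 0.5) / n, py + (j + 0.5) / n, size)
               for i in range(n) for j in range(n)]
    return sum(samples) / len(samples)

def point_sample_row(width, size):
    # aliased version: one sample at each pixel center
    return [checker(x + 0.5, 0.5, size) for x in range(width)]

def box_blur(row, radius=1):
    # blur applied *after* sampling, over the already-aliased values
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

# squares of size 0.3 are smaller than a 1x1 pixel:
print(antialiased_pixel(0, 0, 0.3))   # near 0.5 -- the integral is grey
print(point_sample_row(8, 0.3))       # hard 0/1 values with a moire pattern
print(box_blur(point_sample_row(8, 0.3)))  # the moire, smeared, not grey
```

The anti-aliased pixel lands near grey because it integrates the sub-pixel squares; the blurred aliased row can only average the 0/1 pattern the point sampling happened to pick up.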