It's a tricky balance that depends a lot on the game and the scenery that has to be rendered.

For some games you can get very decent visuals with a decreased resolution and 2x/4x AA; other developers prefer a native 1280x720 with no AA. At present, though, most games run at 1280x720 with 0xAA and hover around 30 FPS.
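To put rough numbers on that tradeoff, here's a quick back-of-the-envelope sketch. The `samples_per_frame` helper and the "samples = pixels × MSAA samples" simplification are my own; real GPU cost also depends on bandwidth, shader load, and so on, so treat this as illustrative arithmetic, not a benchmark:

```python
# Rough comparison of how many samples the GPU must shade/resolve per
# frame at the resolutions discussed above. With MSAA, the framebuffer
# holds multiple samples per pixel, so the work scales with both
# resolution and sample count.

def samples_per_frame(width, height, msaa=1):
    """Pixels per frame times MSAA samples per pixel."""
    return width * height * msaa

configs = [
    ("1280x720, 0xAA",  1280,  720, 1),
    ("1024x600, 2xAA",  1024,  600, 2),
    ("1024x600, 4xAA",  1024,  600, 4),
    ("1920x1080, 2xAA", 1920, 1080, 2),
]

base = samples_per_frame(1280, 720)  # 921,600 pixels at plain 720p
for name, w, h, aa in configs:
    s = samples_per_frame(w, h, aa)
    print(f"{name}: {s:,} samples ({s / base:.2f}x the 720p/no-AA load)")
```

Even by this crude measure, 1024x600 with 2xAA costs about a third more than 720p with none, which is roughly why developers end up picking one or the other rather than getting both.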

Unfortunately, all of these solutions are compromises around the same problem: games are pushing the limits of the consoles' GPUs. Developers want to create bigger environments with higher-resolution textures and 3D models with larger polygon counts for more detail. Optimizing their engines buys them a little more leeway, but you can only optimize so much before you simply need better hardware. The requirements of games, even console games, never shrink; they keep rising, and developers have to make compromises.

At 1280x720 with 0xAA there is a pretty severe aliasing problem. At 1024x600 with 2xAA there is less aliasing, but you hit the other issue: the screen has to upscale a video input nowhere near its native resolution, so you get an effect similar to zooming in, i.e. pixelation.
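The upscaling problem is easy to see in the scale factors themselves. A sketch (my own framing, assuming a 1080p panel): when the factors are non-integer, or different horizontally and vertically, source pixels can't map cleanly onto screen pixels, which is why the stretched image looks soft or blocky:

```python
# Scale factors when a fixed-resolution panel has to stretch a
# sub-native render up to its native 1920x1080 grid.

def scale_factors(src_w, src_h, dst_w=1920, dst_h=1080):
    """Horizontal and vertical stretch applied by the display."""
    return dst_w / src_w, dst_h / src_h

for w, h in [(1280, 720), (1024, 600)]:
    sx, sy = scale_factors(w, h)
    print(f"{w}x{h} -> 1920x1080: x{sx:.3f} horizontal, x{sy:.3f} vertical")
```

720p at least scales uniformly (1.5x in both directions); 1024x600 stretches by 1.875x horizontally but only 1.8x vertically, so pixels get smeared unevenly on top of the enlargement.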

Hopefully, the next generation of consoles will make it their goal to run every game at at least 1920x1080, because I can vouch that at that resolution aliasing is almost a non-issue: 2xAA at most is needed, which cuts out the more demanding anti-aliasing modes entirely.