I remember back in the '90s, when you bought a game on a console, that was the final product. Developers knew they would have no opportunity to fix, change, or add anything to a game after release, so they had to make sure the final product was finished. Once released, the game had to stand on its own: all the content had to be in the game and all the bugs worked out, or it would flop in sales. Things have changed a lot since then.
Publishers seem to be adopting a "release it now, fix it later" mentality, which often leaves a freshly released game an inferior product until some patches arrive down the road. A good modern example is the console version of Fallout: New Vegas, which shipped riddled with hundreds of bugs. I can understand some of the reasoning behind this mentality, since games are far more complex than they used to be, but no game should be released without proper testing. As consumers, the console games we purchase should be finished and near perfect, since the hardware is identical for every player.
Why has the standard for console games fallen so far? Is it so wrong to expect a console game to be free of most, if not all, bugs? PC games are a different story, with literally thousands of possible combinations of hardware, software, and drivers that make each PC different from the next. Minor bugs happen, but game-breaking bugs should be hammered out before a game's release, especially on consoles.