Note that this was actually a transcript from a talk, rather than a formal paper. The talk was framed as a debate between the "serial computing" advocates and the "parallel computing" advocates. There's a chart that illustrates the "expected zone of operation" for most computers -- essentially how much of the work was expected to be serial in nature, and how much parallel. Some of the analysis was done by Professor Knight (can't find his full name) at Stanford, so a fair amount of groundwork sits behind the intuition.
For most workloads, Amdahl expected that parallel machines would give diminishing returns -- and that seems to have been the case. There's no rule against building massively parallel machines; they're just not the common case.
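Those diminishing returns fall straight out of the formula now known as Amdahl's Law: if a fraction p of the work parallelizes across n processors, the serial remainder (1 - p) bounds the overall speedup. A minimal sketch (the helper name is mine, not from the talk):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup on n processors when fraction p of the work
    parallelizes perfectly and the rest stays serial:
        S(n) = 1 / ((1 - p) + p / n)
    """
    return 1.0 / ((1.0 - p) + p / n)

# Even a generous 90% parallel fraction caps out near 10x,
# no matter how many processors you throw at it.
for n in (2, 8, 64, 1024):
    print(n, round(amdahl_speedup(0.9, n), 2))
# 2    -> 1.82
# 8    -> 4.71
# 64   -> 8.77
# 1024 -> 9.91
```

As n grows without bound, S(n) approaches 1 / (1 - p) -- the ceiling set by the serial fraction, which is the crux of the diminishing-returns argument.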
Also note that the "Law" was formulated by someone other than Amdahl. In his ICCAD talk, he mentioned that he had given the above presentation and, because of the contentious nature of the whole event, had put it out of his mind. A few years later, he started hearing about the law from others -- and was somewhat surprised that people had taken the ideas from the talk and run with them.