If you are running a dynamic website – WordPress, Joomla, vBulletin, Drupal – with several hundred pages (or even 50), it is very difficult to disallow or allow pages one by one in robots.txt. Instead, you can use wildcard patterns to allow or disallow robots for the specific sets of pages you need.
See the following examples to understand the URL structure of dynamic websites.
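As an illustration (these domains and query parameters are hypothetical, not taken from any specific site), dynamic CMS and forum software typically generates URLs built from query strings rather than static file names:

```
http://www.example.com/index.php?option=com_content&view=article&id=42   (Joomla)
http://www.example.com/?page_id=123                                      (WordPress)
http://www.example.com/forum/showthread.php?t=1057                       (vBulletin)
```

Notice that pages of the same type differ only in the parameter values; the rest of the URL repeats. That repetition is what wildcard rules take advantage of.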
This kind of page-by-page blocking can also hurt SEO if you end up with thousands of such statements.
By following the wildcard method, you can block search engines from accessing all dynamic content of the same type, as shown in the following code snippet.
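A minimal sketch of such a robots.txt, assuming you want to block all query-string URLs and all `.php` files (adjust the patterns to your own URL structure). Note that `*` and `$` are extensions supported by major crawlers such as Googlebot and Bingbot, not part of the original robots.txt standard:

```
User-agent: *
# Block every URL that contains a query string
Disallow: /*?
# Block every URL that ends in .php ($ anchors the end of the URL)
Disallow: /*.php$
```

A single wildcard rule like `Disallow: /*?` replaces thousands of individual `Disallow` lines, one per dynamic page.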
The robot recognizes this pattern and will not access or index any page that shares the same dynamic URL structure.
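To see how a crawler interprets such a rule, here is a small Python sketch (an illustration of the matching logic, not any crawler's actual implementation) that converts a wildcard `Disallow` pattern into a regular expression and checks URL paths against it:

```python
import re


def wildcard_to_regex(pattern: str) -> re.Pattern:
    """Convert a robots.txt Disallow pattern into a regex:
    '*' matches any run of characters, '$' anchors the end of the URL."""
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile("^" + regex)


def is_blocked(path: str, pattern: str) -> bool:
    """Return True if the URL path matches the Disallow pattern."""
    return wildcard_to_regex(pattern).match(path) is not None


print(is_blocked("/index.php?id=5", "/*?"))   # query-string URL is blocked
print(is_blocked("/about", "/*?"))            # static page is not
print(is_blocked("/page.php", "/*.php$"))     # .php URL is blocked
```

Running this shows that one pattern covers every page with the same dynamic structure, which is exactly why the wildcard method scales where page-by-page rules do not.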