I made a timeline to make it clearer:
04/30 - I disallowed crawling of a few directories using robots.txt.
05/01 - GWT reported a drop in tracked pages from about 800 to about 50.
05/06 - I added (Options -Indexes) to .htaccess, which disables directory listings, so Apache returns a 403 error when a directory without an index file is requested.
05/10 - Traffic and rankings started to drop, and it has been getting worse since then.
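For context, the two changes probably looked roughly like this (the directory name is a placeholder, not my actual path):

```
# robots.txt (04/30) - blocks compliant crawlers from these directories
User-agent: *
Disallow: /example-dir/

# .htaccess (05/06) - disables auto-generated directory listings;
# Apache returns 403 only for directory URLs with no index file
Options -Indexes
```

Note that Options -Indexes only affects requests for directory listings; ordinary page URLs inside those directories still serve normally.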
Which change was responsible for the drop: the robots.txt rules or the .htaccess policy?