Many of us have heard estimates along the lines of Shakespeare's vocabulary running to 30,000 individual words. I have also heard that some under-educated children in the inner city may have a functional vocabulary of fewer than 600 words. That got me thinking about how that kind of calculation is done. I am wondering if there is some kind of program that will count the individual words in a document, cull out the repetitions, and give you back the number of distinct words in a person's working vocabulary. Something like MS Word's "word count" feature with a tweak.
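The basic count seems doable with a short script. Here's a rough sketch in Python; the filename and the crude "word = run of letters and apostrophes" rule are just my placeholders, and a real tool would need smarter tokenization:

[code]
# Minimal sketch: count distinct words in a plain-text file.
# "my_writing.txt" and the simple tokenizing rule below are assumptions,
# not part of any existing tool.
import re
from collections import Counter

def vocabulary_size(path):
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    words = re.findall(r"[a-z']+", text)   # crude tokenizer: letters and apostrophes
    counts = Counter(words)                # maps each word to its frequency
    return len(counts), counts

if __name__ == "__main__":
    size, counts = vocabulary_size("my_writing.txt")
    print(size, "distinct words")
    print(counts.most_common(10))          # 'the' should top the list, per the quote below
[/code]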
Could you create a long-running document of every piece of original text that you have generated and run it through this program to get a number? Maybe a word list could be generated that could be checked against dictionaries to correct for common names, alternative spellings, etc. Does anyone know of anything like this? Any linguists in the group? Besides cunning linguists? I did a few Google searches but failed to find what I was looking for. Wrong search terms, maybe.
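The dictionary check could be bolted onto the same sketch, something like the snippet below. The path /usr/share/dict/words is just my assumption (it exists on many Unix systems); any file with one dictionary word per line would do, and this wouldn't merge alternative spellings like colour/color:

[code]
# Rough sketch of the dictionary check: keep only entries that appear in a
# reference word list, so proper names and typos don't inflate the count.
def filter_against_dictionary(counts, wordlist_path="/usr/share/dict/words"):
    with open(wordlist_path, encoding="utf-8") as f:
        dictionary = {line.strip().lower() for line in f}
    return {w: n for w, n in counts.items() if w in dictionary}

# Usage, continuing from the earlier sketch:
# filtered = filter_against_dictionary(counts)
# print(len(filtered), "distinct dictionary words")
[/code]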
I did find this tidbit on an EFL (English as a Foreign Language) website...
Quote:
The fact that one needs to know fewer than 3,000 words in order to understand 80 percent of a reasonably representative modern English text does not mean that this kind of vocabulary could guarantee any of us complete cultural survival in a modern society. You need to realize that many of the most frequently occurring words in English are function words: articles, prepositions, and auxiliary verb forms such as those of "be, have," or "do." The article "the" is by far the most frequently used word in English, occurring 69,975 times in the one-million-word database.
Anybody else have any clues?