Suppose you have to install a program on all the computers in a computer lab.
You could just go around to each computer individually and install the program by hand, and it would take you about 30 minutes.
You could also design and write a program that automates the install: it lets you command the computers to pull the program off a server and install it on themselves. Designing and writing that tool would take about 2 hours of research and 25 minutes of coding, plus another 30 minutes to install the tool you just wrote on all the computers. After that, it takes 5 minutes to install the first program on all of the computers simultaneously.
So let's see... 30 minutes or 3 hours. Why the fuck would anyone do the 3 hours?
Over the long run, doing 3 hours of (fun) coding work once saves you 25 minutes of repetitive drudgery every time you have to install a new program. About a half-dozen installs later, you've come out ahead in terms of time spent.
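The break-even arithmetic can be sketched in a few lines of Python; the numbers are taken straight from the scenario above:

```python
# Break-even point for the install-automation scenario (all times in minutes).
manual = 30                 # installing one program by hand on every computer
scripted = 5                # installing one program once the tool exists
upfront = 2 * 60 + 25 + 30  # research + coding + rolling the tool out

# On the first program you're behind by the upfront cost minus what
# the manual route would have taken.
deficit = (upfront + scripted) - manual   # 150 minutes in the hole
saved_each = manual - scripted            # 25 minutes saved per later install

print(deficit // saved_each)  # installs needed to break even → 6
```

Six more installs to break even, which is where the "about a half-dozen" figure comes from; every install after that is pure profit.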
So it's largely a matter of looking at the return on investment over the right timescale. If you study for weeks NOW, you might only get +15% on the next test. But if you settle for 80%, you're going to have to make up for that 20% hole in your knowledge at some point... assuming you're actually learning anything useful, of course. If the only time you'll ever use the knowledge is on the test itself and never again... meh, what's the point?
__________________
Simple Machines in Higher Dimensions