A history major in college, I've always tried to gain insight into the future by looking back at the past. In my role as an industry analyst, that predilection serves me well.
Let's face it--predicting the future is a task fraught with danger, and industry analysts are not always good at it. A decade ago, when I was working on the vendor side, I amused myself by reading three-year-old Forrester Research reports and seeing how far off their predictions were.
This isn't to say I haven't committed my own howlers. When I wrote the first analyst report on web analytics in mid-2000, I forecast that the market would reach $425 million in revenue in 2000 (up from $141 million in 1999) and enjoy robust growth for years to come. The $425 million figure was actually pretty accurate--but I didn't foresee the impact of the dot-com bust, which kept revenues stagnant for the next several years. The web analytics market certainly never hit my predicted mark of $4 billion in 2004. That mistake became a history lesson of its own: always take outlier scenarios into account.
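To see just how aggressive that forecast was, it helps to work out the growth rate it implied. This little calculation is purely illustrative; only the dollar figures come from the report above:

```typescript
// Implied compound annual growth rate (CAGR) from $425M in 2000
// to a forecast $4B in 2004: (end / start)^(1 / years) - 1.
const start = 425;  // $M, 2000 (roughly what the market did)
const end = 4000;   // $M, 2004 forecast
const years = 4;
const cagr = Math.pow(end / start, 1 / years) - 1;
console.log(`${(cagr * 100).toFixed(0)}% per year`); // ~75% per year
```

Sustaining roughly 75% annual growth through a downturn was never in the cards, which is exactly the kind of outlier scenario the forecast ignored.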
I've worked in high tech long enough to see that new technologies are often old architectures wrapped in new vocabulary. Ajax, for example, is another name for client/server on the web. In fact, client/server itself was a repackaging of an architecture used by the Wang VS minicomputer in the 1980s, when every VS terminal contained a Z80 chip so that graphics could be rendered on the terminal without sending an interrupt back to the central CPU after every keystroke. (By contrast, every terminal keystroke went back to the DEC VAX, which is why Wang always beat DEC in high-volume word processing applications.)
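To make the parallel concrete, here is a minimal sketch of the Ajax pattern in TypeScript. The `#search` input, the `/search` endpoint, and the `render` helper are all hypothetical; the point is that each keystroke is handled locally in the browser, with only an occasional asynchronous request going back to the server--much as the Wang terminal's Z80 handled keystrokes without interrupting the central CPU.

```typescript
// Grab a search box on the page (hypothetical element id).
const input = document.querySelector<HTMLInputElement>("#search")!;
let timer: number | undefined;

input.addEventListener("input", () => {
  // Each keystroke is processed locally: no server round-trip here.
  clearTimeout(timer);
  timer = window.setTimeout(async () => {
    // One request after the user pauses typing, not one per keystroke.
    const res = await fetch(`/search?q=${encodeURIComponent(input.value)}`);
    const results = await res.json();
    render(results);
  }, 300);
});

// Placeholder for updating the page in place, client/server style.
function render(results: unknown): void {
  console.log(results);
}
```

The debounce interval and endpoint are assumptions for the sketch; what matters architecturally is the division of labor--local processing on the client, with the server consulted only when there is real work for it to do.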