With the exception of the tragedy of September 11, the year 2001 was relatively normal and uneventful: remember, this should have been the year of Clarke’s and Kubrick’s Space Odyssey mission to Jupiter; it should have been the year of the HAL-9000 computer. Today, the Personal Computer is as ubiquitous as HAL was aboard the Discovery spaceship. And the rate of technology development and market growth in the electronics industry still follows the famous ‘Moore’s Law’, almost four decades after it was first formulated: in 1965, Gordon Moore (then at Fairchild Semiconductor, later a co-founder of Intel Corporation) predicted the doubling of the number of transistors on a chip every 2 years, corrected to 18 months in 1967; at that time, the landing on the Moon was in full preparation. Curiously enough, today no one cares to go to the Moon again, let alone Jupiter. And, in spite of all the effort in digital engineering, we still do not have anything close to 0.1% of HAL’s capacity (fortunately?!). Whilst there are many research labs striving to put artificial intelligence into a computer, there are also rumors that this has already happened (with Windows-95, of course!).
Optimization of Time-Domain Transient Response
Maximization of System Bandwidth
Detailed Circuit Analysis, Complete Math Derivations
Computer-Aided Design Routines (using Matlab)
Practical Know-how