I was recently reading Michio Kaku's book, "Physics of the Impossible", which discusses many of the technologies we have become used to seeing in modern fiction, including time travel, matter transportation, faster-than-light travel, and UFOs.  It's an interesting read for anyone who has an interest in physics.

One subject that struck me as particularly interesting was the future of microelectronics.  We are used to chip manufacturers packing more and more transistors into the same tiny space, and clocking devices ever faster.  However, some physicists now think that this age will be over within the next 20 years.  Photolithography (using UV light to etch circuits) creates feature sizes down to about 50nm, so a transistor 50nm wide is the smallest you can make with this technique.  I understand that you can make smaller features using electron beams, but 50nm is already quite small: it is only about 200 atoms of silicon across.
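
The "200 atoms" figure is easy to sanity-check.  Here is a quick sketch, assuming a silicon atomic spacing of roughly 0.25nm (the Si-Si bond length is about 0.235nm):

```python
# Rough check: how many silicon atoms fit across a 50nm feature?
# The 0.25nm spacing is an assumed round figure for illustration.
feature_size_nm = 50.0
atom_spacing_nm = 0.25  # approximately the Si-Si bond length

atoms_across = feature_size_nm / atom_spacing_nm
print(f"{feature_size_nm:.0f}nm is about {atoms_across:.0f} silicon atoms wide")
# -> 50nm is about 200 silicon atoms wide
```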
You can see that we are reaching some fundamental physical limits.  The powerful CPUs we make today dissipate vast amounts of waste heat that does no useful work, and the problem gets worse as you wind up the clock speed, because a chip's switching power grows in proportion to its frequency (and with the square of its supply voltage).
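
To put rough numbers on that, the standard first-order model for CMOS switching power is P = C·V²·f.  A minimal sketch, where the capacitance and voltage are purely illustrative assumptions rather than measurements of any real chip:

```python
# First-order CMOS dynamic power model: P = C * V^2 * f.
# C and V below are illustrative assumptions, not real chip data.
def dynamic_power_watts(capacitance_f, voltage_v, frequency_hz):
    """Approximate switching power dissipated as heat."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

C = 1e-9  # assumed effective switched capacitance: 1 nF
V = 1.2   # assumed supply voltage: 1.2 V

for f_ghz in (1.0, 2.0, 4.0):
    watts = dynamic_power_watts(C, V, f_ghz * 1e9)
    print(f"{f_ghz:.0f} GHz -> {watts:.1f} W")
# Doubling the clock doubles the heat, and higher clocks usually
# also demand higher voltage, which hurts quadratically.
```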

But if you can't simply add ever more cores to today's CPUs to make them faster, then what is the way forward?  Software certainly has a role to play here, as more careful construction of algorithms has the potential to make some processes hundreds of times faster.  But developments in software have not kept pace with hardware engineering: although we have new programming languages and better operating systems than we did 20 years ago, writing code that efficiently uses the massively parallel hardware we have built has been a slow process.  Perhaps a slowdown in the development of silicon chips is just the incentive the software industry and computer science need to make things better?
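
As a small illustration of how much algorithm choice alone can matter, here is a sketch (the task and input size are made up purely for demonstration) comparing two ways of computing the same prefix sums; on inputs of this size the gap already reaches the "hundreds of times" scale:

```python
import time

def prefix_sums_quadratic(values):
    # Recomputes every prefix from scratch: O(n^2) additions.
    return [sum(values[: i + 1]) for i in range(len(values))]

def prefix_sums_linear(values):
    # Carries a running total forward: O(n) additions.
    totals, running = [], 0
    for v in values:
        running += v
        totals.append(running)
    return totals

data = list(range(20_000))

start = time.perf_counter()
slow = prefix_sums_quadratic(data)
quadratic_time = time.perf_counter() - start

start = time.perf_counter()
fast = prefix_sums_linear(data)
linear_time = time.perf_counter() - start

assert slow == fast  # identical results, wildly different cost
print(f"quadratic: {quadratic_time:.3f}s  linear: {linear_time:.3f}s  "
      f"speed-up: {quadratic_time / linear_time:.0f}x")
```

And that is a single-core improvement; spreading work efficiently across the many cores we already have is a separate, and so far much harder, battle.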