Saturday 4 June 2011

How has computer technology changed in the last 30 years?

My ICT homework is to find out how computer technology has changed in the last 30 years - like, what's different now that wasn't there 30 years ago? Please help! Thanks :)

I don't get the grade, so I'm not doing YOUR work... it's called "look it up".

HAHAHAHA

Here you go - you should get top marks on your project using this info:





http://en.wikipedia.org/wiki/Computer

There are too many to count here, but the most important ones in my opinion are:


1. 30 years ago computers in homes were rare and very basic (most serious computers were far too big and too expensive for the home)


2. Obviously the internet and its evolution (this is a subject on its own: email, social networks, 24/7/365 connectivity, education in developing countries, etc.)


3. Computers have shrunk so much that today you can carry an almost fully functional PC in your hand (smartphones, tablets, etc.)


4. There are tons more, but far less important (printers, digital media and much, much more).

Tracy, that is one heck of a question, as it covers such a vast number of related technologies.





The main change in computing is capability, i.e. what we are able to achieve or carry out with computer technology, as it has filtered into many industrial, commercial, educational, scientific and domestic applications across society over the last 30 years.





The reasons for these changes are again many, but the main one is the continual miniaturisation of electronic components, which allows more electronics in a smaller space and also reduces the cost of mass production, giving you more bang for your buck. Check out "Moore's Law" (Gordon Moore, 1965).
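To get a feel for what Moore's Law implies over 30 years, here is a rough Python sketch of my own (not from anywhere official), assuming a fixed two-year doubling period and using roughly 29,000 transistors - about an early-1980s chip such as the Intel 8086 - as a starting point:

def transistors_after(years, start_count=1, doubling_period=2):
    # Rough transistor count after `years`, assuming it doubles every `doubling_period` years.
    return start_count * 2 ** (years / doubling_period)

print(transistors_after(30))            # 15 doublings -> 32768.0 (a ~33,000x increase)
print(transistors_after(30, 29000))     # ~950 million, starting from a ~29,000-transistor chip

The real doubling period has varied between roughly 18 months and 2 years, so treat the exact numbers loosely; it is the repeated doubling that matters.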





This continual miniaturisation has allowed greater innovation and integration of other technologies, thus expanding computing power and the capabilities of computers.





One of the first attachments to the computer was a monitor (VDU); some machines even used TVs by including a UHF modulator (connected to the TV's aerial input) to show results visually. Printers were added to produce a hard copy of results, print letters and so on. These are output devices. Input devices such as the mouse, scanner and graphics tablet were added too. This bolt-on approach to computing has continued throughout, leaving us with the modern computer, which will continue to grow in capability as the years pass by.





By connecting the computer to the telephone network we got computer faxing, and the internet was developed, allowing email, video conferencing, global banking and other communication abilities such as the remote control of off-site equipment.





Computer development, though not the cause of globalisation, has certainly aided it, allowing us to accomplish far more with greater efficiency on a global scale. This is obvious with the advent of social networking sites such as Bebo, Facebook, MSN and Twitter.





There is also an increasing social trend to place more faith and trust in the modern computer as our familiarity with computers increases.





With regard to the technology itself, take a look at the history of the CPU chip. Start with the Zilog Z80, then the 8080, 8086, 286, 386, 486 and Pentium chips from Intel;





the 6500/6502 family from MOS Technology (including the 6510 in the Commodore 64) and the 6800/68000 series from Motorola;





and the ARM RISC chip and the Inmos transputer (one of the first true parallel computing chips in the world).





The chips mentioned above cover the 8-bit and 16-bit eras, moving into 32-bit computer chips. Modern computers use 64-bit architecture, with 128-bit and wider registers reserved for specialised work.
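To put those word sizes in perspective (a small illustration of my own, nothing more), the largest unsigned number an n-bit register can hold is 2^n - 1:

# Largest unsigned value an n-bit register can hold.
for bits in (8, 16, 32, 64):
    print(f"{bits}-bit: {2 ** bits - 1:,}")

# 8-bit:  255
# 16-bit: 65,535
# 32-bit: 4,294,967,295
# 64-bit: 18,446,744,073,709,551,615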





In the early days we started with hundreds of thousands of instructions per second in the old 8-bit CPUs; then we went to MIPS (millions of instructions per second), on to billions of instructions per second, and the progression will continue.





In recent years miniaturisation has allowed much greater speeds, but this has created a problem with overheating. To work around this, chip manufacturers limited the clock speed of the CPUs but increased the number of CPU cores on each chip to divide the workload and control the heat build-up.
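As a minimal sketch of that divide-the-workload idea (my own example, using Python's standard multiprocessing module rather than anything tied to a particular chip), this splits a prime-counting job into one chunk per core:

from multiprocessing import Pool, cpu_count

def count_primes(bounds):
    # Count primes in [start, stop) by simple trial division.
    start, stop = bounds
    total = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    cores = cpu_count()
    step = 200000 // cores     # one chunk per core (the small remainder is ignored for simplicity)
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]
    with Pool(cores) as pool:  # each chunk is handled by a separate process, so it can run on its own core
        print(sum(pool.map(count_primes, chunks)))

With four cores the four chunks are counted at the same time, which is the heat-friendly way of getting more work done without raising the clock speed.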





Early processors ran at about 1 megahertz (0.96 or 0.97 MHz in some machines); current processors average 3-4 gigahertz but have multiple CPU cores to overcome the speed limitations.
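Putting rough numbers on that (my own back-of-the-envelope figures, which ignore the large per-cycle improvements modern chips also have):

old_hz = 1e6       # ~1 MHz 8-bit home computer, early 1980s
new_hz = 3.5e9     # ~3.5 GHz per core today
cores = 4

print(new_hz / old_hz)           # ~3,500x more clock cycles per second, per core
print(cores * new_hz / old_hz)   # ~14,000x across four cores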





Hope this has been useful; it should at least give you a starting point for your research.