Monday, July 28, 2008
Nokia N93
The Nokia N93 is a smartphone by Nokia especially designed for multimedia use. It was introduced in April 2006 and released in July 2006.
The N93 has improved camera capabilities compared with the earlier N90. The phone has a 3.2 megapixel camera, Carl Zeiss optics and 3x optical zoom as well as a 30 fps VGA-resolution MPEG-4 video recording capability. It was the most advanced camera phone from Nokia at the time of its release.
Features
DVD-like video capture at 30 frames per second in the MPEG-4 format at VGA resolution
3.2 MP camera with Carl Zeiss Vario Tessar optics and flash
2.4” display, 320 x 240 pixels, up to 262,144 colors
3x optical zoom / 20x digital zoom
direct TV out connectivity
easy video creation and burning to DVD with Adobe Premiere Elements 2.0
digital image stabilization
close up mode
Visual Radio
up to 50MB internal memory, plus 2GB miniSD card storage - about 90 minutes of "DVD-like" video (see the bitrate sketch after this list)
Infrared and Bluetooth
WLAN (b and g), 3G (WCDMA 2100 MHz), EDGE and GSM (900/1800/1900 MHz) networks
Java MIDP 2.0
Symbian application support
UPnP (Universal Plug and Play) support
Comes standard with a full Web browser
Fully HW accelerated PowerVR 3D graphics from Imagination Technologies (including OpenGL ES 1.1 and M3G, see JBenchmark)
Push to Talk over Cellular (PoC)
N-Gage compatible
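A quick back-of-the-envelope check of the storage claim in the list above: if a 2GB card really holds about 90 minutes of footage, the implied average bitrate follows from simple arithmetic. A minimal Python sketch, assuming a decimal 2 GB and ignoring filesystem and container overhead:

```python
# Rough sanity check of "90 minutes of DVD-like video on a 2 GB card".
# 2 GB is taken as 2 * 10**9 bytes; card formatting and container
# overhead are ignored, so treat the result as an estimate.

CARD_BYTES = 2 * 10**9          # 2 GB miniSD card
MINUTES = 90                    # claimed recording time

bits = CARD_BYTES * 8
seconds = MINUTES * 60
bitrate_mbps = bits / seconds / 1e6
print(f"Average video bitrate: {bitrate_mbps:.1f} Mbit/s")
# ~3.0 Mbit/s -- in the same ballpark as a low-bitrate DVD stream
# (DVDs typically run at roughly 4-9 Mbit/s), matching the "DVD-like" label.
```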
Nokia N96
The Nokia N96 wins your heart with its looks and functionality. This superior-quality phone is stylish and comes in black. The dual-slide mechanism gives the phone a very smart feel, and it is arguably the best mobile of the N series. The features of this new N series phone will simply mesmerise you. The phone has an internal memory of 16GB, and with such huge memory you can store virtually everything: all your favourite numbers, videos and data. The memory is one of the main selling points of the Nokia N96.
The best part of the Nokia N96 is that, despite being a high-end phone, everyday features like SMS, MMS, email, instant messaging, vibration alert, calculator, organiser, personal entries and a 30-day call record make it an all-purpose phone. Alongside these basics, downloading games, music and songs and browsing the web is child's play.
After its memory, the Nokia N96 has three more things that make it an ultimate choice: the camera, GPS, and fast Internet connectivity. The 5 megapixel camera, with auto focus, 20x zoom and Carl Zeiss optics, gives a new experience of photography. The secondary VGA camera for photo call and video call options is another fun element of this phone.
GPRS Class 32 (107 / 64.2 kbps), HSCSD, EDGE Class 32 (296 kbps), 3G HSDPA and WLAN Wi-Fi make internet connectivity a very simple act. You can download music, ringtones and data and transfer files with ease and comfort.
The built-in A-GPS and the integrated GPS receiver let users explore the world around them along entirely new routes. You have the whole city in your palm and can move about unfamiliar places as if they were your native town.
Besides the features mentioned above, the Nokia N96 has many other attributes, such as TV out, Bluetooth, built-in speakers and a media player that handles MP3, AAC, eAAC+ and WMA files, which make it a phone for all users.
New Kawasaki Versys in 2008
The new 2008 Kawasaki Versys is a machine which lives in a hard-to-define sweet spot in the motorcycling cosmos. Is it a practical commuter? Long-legged urban attack vehicle? Sport bike? Light tourer? The 649cc parallel twin and the neutral-handling, light-steering chassis of the Ninja® 650R sport bike made an ideal starting point for this comfortable gridlock commando. A swoopy gull-wing swingarm, long-travel inverted 41mm fork, six-spoke superbike-inspired wheels, adjustable-height windscreen and a comfortable cockpit combine to create a motorcycle that just begs to be ridden wherever the roadway might lead.
The compact liquid-cooled, fuel-injected, DOHC, 8-valve, 649cc parallel twin engine was modified for the Versys so that it delivers smooth power well-suited to off-the-line acceleration, and flexible torque for the cut-and-thrust of a rider's morning commute. The engine's overall character is entertaining for even the most advanced riders, yet predictable enough to inspire confidence in less-experienced motorcyclists.
Its suspension offers the next level of sophistication: an advanced Showa rear shock featuring a free piston and two-stage damping valves for progressive compression damping, which firms significantly as the shock moves through its stroke. This allows longer wheel travel with a feel that is at first soft, like a dual sport, but firms to near sport-bike levels as suspension loads escalate. Of course, this advanced shock is fully adjustable and connected to a beautiful aluminum gull-wing swingarm that is longer than average, thanks to a short, compact engine and chassis.
Review of Hero Honda Hunk
Hero Honda's latest offering, the Hunk, is, as the name indicates, a manly bike. The Hunk is the fourth product in the 150cc-and-above range from Hero Honda, the previous ones being the 156cc CBZ Xtreme, 150cc Achiever and 225cc Karizma. This machine has almost the same power and torque as the CBZ Xtreme, with 14.2bhp at 8500 rpm and 12.8 Nm at 6500 rpm. The engine is also tuned for a smooth, free-flowing power delivery.
The bike is available in four colors: black, blue, red and silver. The air filter and battery covers are finished in black and silver, and the tail piece carries dual-tone black paintwork. The engine is not the typical black either; it has silver and grey finishes. The front forks and gas-filled rear shocks have copper and grey accents. The bike has a big tank and black alloy wheels.
The Hunk is very sporty in looks and performance. The stance of the bike is definitely upright, with brawny styling. The design is excellent, and it looks much like the CBZ Xtreme; unlike the CBZ, though, the grab rail on the Hunk is a single piece. The riding position is very comfortable. The chassis is a tubular diamond type. The front brake is a 240 mm disc and the rear a 130 mm drum. The wheelbase is 1325 mm and ground clearance is 145 mm. The front rim is 1.85 x 18 alloy and the rear rim 2.15 x 18 alloy. The headlight is a halogen type with a 12V 35/35W lamp, and the tail light is 12V 5W. There are four turn-signal lights with clear, multi-reflector lenses to help the rider. The bike weighs 143 kg in the kick-start model and 146 kg in the self-start model. The height of the bike is 1095 mm.
The air-cooled, 4-stroke, single-cylinder 150cc engine of the Hunk produces 14.4 PS at 8500 rpm and 12.80 Nm of torque at 6500 rpm. The displacement is 149.2cc. The front suspension is telescopic, while a new five-step Gas Reservoir Suspension (GRS) is fitted at the rear, and the five-spoke wheels are also finished in black. The tank capacity is 12.4 litres, with a usable reserve of 2.2 litres.
The Hero Honda Hunk is available in two versions, kick-start and self-start. The kick-start model is priced at Rs 55,000, while the self-start model carries a Rs 2,000 premium at Rs 57,000.
The meter console on the Hunk is the traditional type, unlike the digital meters on most modern bikes. The console consists of three-pod instrumentation: the fuel gauge, speedometer and rpm meter are all analogue. Modern pilot lamps and LED tail lamps fail to find a place on Hero Honda's new Hunk. The Hunk's battery is the same 12V-7Ah unit, and ignition is handled by AMI (Advanced Microprocessor Ignition).
Review of Bajaj Pulsar 220cc
After introducing the Pulsar 200cc, Bajaj has now launched the Pulsar 220cc, the most masculine of Indian bikes. The new Pulsar DTS-Fi (Digital Twin Spark-Fuel injected) is the first bike from Bajaj Auto with the powerful combination of twin spark plugs and fuel-injection technology, and is an ultimate machine for the performance-motorcycle fan. The latest offering sets new benchmarks in technology, performance and styling to address the needs of a growing segment of pro-bikers. The 200cc Pulsar is said to be the best-looking and best-handling of the Pulsar family. However, it remains to be seen whether these bikes will age as gracefully as their predecessors. Bajaj's Pulsar DTS-Fi 220cc motorcycle is priced at around Rs 85,000.
Special features
In addition to the features mentioned in the design section, both these bikes have special features which increase their practicality. The Pulsar 200cc leaves its drive chain exposed, but with pre-lubricated O-ring seals that help it stay reliable despite running without protection. Like its sibling, the soon-to-be-launched Pulsar 220cc, the Pulsar 200cc throws away the kick lever and lets you crank the engine only with your thumb, unlike the Karizma, which offers both self and kick starting. Both these bikes use the best instruments and features unheard of on any Indian bike. Both the Pulsar 200cc and the new Karizma have a digital odometer, digital fuel gauge and digital trip meter. The Pulsar 200cc features a digital console which uses only LEDs and intelligently varies its amber backlight intensity for viewing by day or night.
On the 220cc Pulsar, the most important feature of all is the fuel-injection mechanism, which is responsible for the crisp throttle response and the linear power curve. The Bajaj Pulsar DTS-Fi is also the first Indian motorcycle with an oil-cooled engine. Together, they make for a potent mixture. The Hero Honda Karizma and the Pulsar DTS-Fi both weigh 150 kg, and the extra 3 bhp of the DTS-Fi, as well as its power delivery characteristics, make it an easy Karizma-beater.
Friday, July 25, 2008
Sony Ericsson K810i
The Sony Ericsson K810i is the second-latest mobile phone in its class (the K series) from Sony Ericsson, released on 4 May 2007. It is an upgraded version of the Sony Ericsson K800i, the main differences being the housing and the size of the phone. Like its predecessor, the K810 carries the Cyber-shot branding and has a built-in digital camera capable of taking 3.2 megapixel photographs.
Overview
The Sony Ericsson K810 is available in "noble blue", "pulse red" and "golden ivory" colours. It retains its predecessor's 3.2 megapixel camera with auto focus. As well as the visual differences from the K800, the new K810 adds Photo Fix software for editing photographs.
Camera
The digital camera is able to take photos at four resolutions: 3.2 megapixels (2048x1536), 2 MP (1632x1224), 1 MP (1280x960), and VGA (640x480), with Standard and Fine quality settings at each resolution (the megapixel arithmetic is checked in the sketch after this list). Its features are the same as those of the K800i:
Four shoot modes - Normal, BestPic, Panorama, Frames
Eight scene presets - Auto, Twilight landscape, Twilight portrait, Landscape, Portrait, Beach/Snow, Sports, Document
Three focus modes - Auto, Macro, Infinite
Three flash settings - Off, Auto, Red-eye reduction
Four special effects which can be applied to the photographs - Black & White, Negative, Sepia, Solarize
Manually adjustable white balance with the following available presets - Automatic, Daylight, Cloudy, Fluorescent, Incandescent
Two metering settings - Normal metering, spot metering
Image and video stabilisation
Xenon flash with AF illuminator assist
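Circling back to the resolution figures quoted before this list: the megapixel labels can be recomputed from the listed pixel dimensions, with rounding to marketing-friendly numbers the only wrinkle. A small Python check:

```python
# Verify the advertised megapixel figures against the pixel dimensions
# quoted in the spec (the rounding in the comments is ours).
resolutions = {
    "3.2 MP": (2048, 1536),
    "2 MP":   (1632, 1224),
    "1 MP":   (1280, 960),
    "VGA":    (640, 480),
}
for label, (w, h) in resolutions.items():
    print(f"{label}: {w}x{h} = {w * h / 1e6:.2f} megapixels")
# 2048x1536 = 3.15, 1632x1224 = 2.00, 1280x960 = 1.23, 640x480 = 0.31
```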
It also has a self-timer, which is useful for self-portraits. A light press of the shutter button initiates the auto-focus feature of the camera lens; when a green dot appears on the screen, the camera has focused, and a full press will capture the photograph. The camera can also capture video clips at 176x144 (QCIF) resolution at 10 frames per second. The camera, which also features 16x digital zoom, is situated on the back of the phone and is protected by a sliding gloss-plastic cover with a metal-effect finish.
Specifications
Screen
262,144-color TFT
Available colours
Noble Blue
Golden Ivory
Pulse Red
Sizes
106 x 48 x 17 mm
Weight
103 grams
Memory
Memory Stick Micro (M2), supports 8GB (tested by mobile-review.com), but only 4GB cards are currently available on the mass market
Phone memory 64MB
SAR Level
Head 0.61 W/kg; Body 0.42 W/kg; Data 0.77 W/kg
Phone Features
3G video calling
Bluetooth
Infrared port
Web Access
Media Player
TrackID
FM Radio
Cyber-Shot Camera
3G high speed internet connection
Expandable Memory
Organizer
JAVA games
Nokia N95
The Nokia N95 (N95-1, internally known as RM-159) is a smartphone produced by Nokia. The N95's functions include those of a camera phone and portable media player, in addition to offering e-mail, web browsing, local Wi-Fi connectivity and text messaging. It is part of the company's Nseries line of smartphones. The N95 runs on Symbian OS v9.2, with an S60 3rd Edition platform with Feature Pack 1. The phone uses a two-way slider to access either media playback buttons or a numeric keypad. Three newer versions have also been released: the upgraded N95 8GB (N95-2), the N95-3, which adds 3G support for North America, and the N95-4 which is an America-targeted version of the N95-2.
Features
Integrated GPS
The N95 contains an integrated GPS receiver which is located below the 0 key on the keypad. The phone ships with Nokia Maps navigation software. Maps are free and can be downloaded either over the air (via a carrier's data packet network) or through the phone's built-in WiFi. Maps can also be downloaded via a PC using the Nokia MapLoader application. To use voice navigation within Nokia Maps a license needs to be purchased from Nokia. Individual city guides are also available for purchase. A-GPS was added later, which greatly improved the initial positioning time.
Multimedia abilities
The N95's dedicated multimedia keys are accessed via the 2-way slider
The N95 is a music player. It supports MP3, WMA, RealAudio, SP-MIDI, AAC+, eAAC+, MIDI, AMR and M4A. Its two-way slider, when slid towards the keypad, allows access to its media playback buttons. A standard 3.5 mm jack is located on the left side of the phone and allows the user to connect any standard headphones to the unit; a user can also use Bluetooth for audio output using A2DP. The device features built-in stereo speakers. The N95 is also capable of playing videos through the included RealPlayer application. Videos can also be played through the TV-out feature: a special graphics chip and companion utility that allow users to connect the smartphone, using the supplied composite cable, to any TV or audiovisual device. Its aim is to let you show your photos and videos on a large screen, and the internet, video games and music features can be used the same way. The N95's built-in UPnP capabilities allow the user to share the phone's digital media over a WLAN network. This provides easy access to the photos, music and videos stored on the phone, and also enables media to be downloaded from other UPnP-capable devices on the network.
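For the UPnP sharing described above, device discovery happens over SSDP, a simple UDP multicast protocol. The sketch below is a generic SSDP M-SEARCH probe of the kind another device on the WLAN could use to find a media server such as the N95; it is standard UPnP, not Nokia-specific code:

```python
# Minimal SSDP (UPnP discovery) sketch: multicast an M-SEARCH and print
# any media servers that respond on the local network.
import socket

MCAST_ADDR = ("239.255.255.250", 1900)   # standard SSDP multicast endpoint
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",                              # devices may wait up to 2 s to reply
    "ST: urn:schemas-upnp-org:device:MediaServer:1",
    "", "",
])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
sock.sendto(MSEARCH.encode("ascii"), MCAST_ADDR)
try:
    while True:
        data, addr = sock.recvfrom(4096)
        print(f"UPnP media server at {addr[0]}:")
        print(data.decode(errors="replace"))
except socket.timeout:
    pass  # no more responses within the timeout
```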
Internet
The N95 has built-in Wi-Fi, with which it can access the Internet (through an 802.11b/g wireless network). The N95 can also connect to the Internet through a carrier packet data network such as UMTS, HSDPA, or EDGE. The web browser displays full web pages, as opposed to the simplified pages on most other phones. Web pages may be viewed in portrait or landscape mode, and automatic zooming is supported. The N95 also has built-in Bluetooth and works with wireless earpieces that use Bluetooth 2.0 technology, as well as supporting file transfer.
It should be noted that the original N95 does not support U.S. versions of UMTS/HSDPA; UMTS features in this version of the phone are disabled by default as sold in the US (but can be reactivated if needed). Likewise, the N95 U.S. version only supports AT&T's 850/1900 MHz UMTS/HSDPA bands, not the 1700 MHz T-Mobile USA band or the 2100 MHz band used internationally.
The phone can also act as a WAN access point allowing a tethered PC access to a carrier's packet data network. VoIP software and functionality is also included with the phone (though some carriers have opted to remove this feature).
Accelerometer
The N95 includes a built-in accelerometer. This was originally only used for video stabilization and photo orientation (to keep landscape or portrait shots oriented as taken).
Nokia Research Center has recently exposed an application interface directly to the accelerometer, allowing software to use the data from it. Nokia has released an application to demonstrate this.[11][12]
Third-party programs have already begun to appear, including software that will automatically change the screen orientation when the phone is tilted, programs that simulate the sounds of a Star Wars lightsaber[13] when the phone is waved through the air, others that let you mute the phone by turning it face-down, and many more.
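The orientation-switching trick is easy to picture in code: gravity dominates one accelerometer axis, and the sign of that axis picks the orientation. A hedged sketch, where read_accelerometer() stands in for whatever sensor API the phone exposes, and the axis conventions are assumptions:

```python
# Sketch of accelerometer-driven screen rotation: map a gravity vector
# (in g) to a coarse orientation by seeing which axis gravity dominates.
def orientation(x: float, y: float, z: float) -> str:
    """Pick a screen orientation from raw accelerometer readings."""
    if abs(x) > abs(y):
        return "landscape-left" if x > 0 else "landscape-right"
    return "portrait" if y > 0 else "portrait-upside-down"

# x, y, z = read_accelerometer()   # hypothetical sensor call
print(orientation(0.02, -0.98, 0.10))   # gravity along -y -> portrait-upside-down
```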
Tuesday, July 8, 2008
Motorola 6800
The 6800 is an 8-bit microprocessor produced by Motorola and released shortly after the Intel 8080 in late 1974. It had 78 instructions, including the (in)famous, undocumented Halt and Catch Fire (HCF) bus test instruction. It may have been the first microprocessor with an index register.
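The index register is worth a concrete illustration. In the 6800's indexed addressing mode, the effective address is the 16-bit index register X plus an 8-bit unsigned offset encoded in the instruction. A toy model in Python (not a real emulator; the memory contents are made up):

```python
# Toy model of the 6800's indexed addressing mode:
# effective address = index register X + 8-bit unsigned offset.
memory = bytearray(65536)        # 64 KB address space
memory[0x2005] = 0x42            # value we want to load

X = 0x2000                       # 16-bit index register

def lda_indexed(offset: int) -> int:
    """LDAA offset,X -- load accumulator A from address X + offset."""
    effective = (X + offset) & 0xFFFF
    return memory[effective]

A = lda_indexed(0x05)
print(hex(A))                    # 0x42
```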
It was usually packaged in a 40 pin DIP (dual-inline package).
Several first-generation microcomputers of the 1970s, available by mail order as kits or in assembled form, used the 6800 as their CPU; examples are the MEK6800D2 development board, the SWTPC 6800 (the first computer to use the 6800), the MITS Altair 680 range (MITS offered these as alternatives to its Altair 8800 which used the Intel 8080), several of the Ohio Scientific designs, Gimix, Smoke Signal Broadcasting, Midwest Scientific, and the Newbear 77/68.
The 4051, a professional grade desktop graphical system intended for user programming in BASIC, was manufactured and sold by Tektronix. This integrated a 6800 processor, memory card, storage display tube, keyboard, and magnetic tape cassette in a single unit and employed an external thermal imaging printer for hard copy.
The 6800 'fathered' several descendants, the pinnacle being the greatly extended and semi-compatible 6809, which was used in the Vectrex video game console and the TRS-80 Color Computer, among several others. There are also many microcontrollers descended from the 6800 architecture, such as the Motorola 6801/6803, 6805, RS08, 68HC08, 68HC11 and 68HC12.
Hitachi, Ltd. acted as a second source for many of Motorola's CPUs, and also produced its own derivatives including the 6301 and 6303, which could run 6800 code. These microprocessors also had a couple of extra instructions added to their instruction sets.
Competitor MOS Technology came up with an architectural relative of the 6800, with its 6502 ('lawsuit compatible' MPU) and its successors. The 6502 did not have the 16 bit registers of the 6800, but had more addressing modes and was substantially cheaper. The 6502 was used in many computers and game consoles during the late 1970s and early-to-mid-1980s (most notably the Atari 2600, Apple II, the Commodore PET, VIC-20 and Commodore 64, the Acorn Electron/BBC Microcomputer, and the Nintendo Entertainment System/NES).
The 6800 was supplanted by the Motorola 68000, used in large numbers in the Apple Macintosh family before the introduction of the PowerPC, a RISC technology developed by IBM and produced jointly with Motorola.
Intel 4040
Produced: from 1974 to 1981[1]
Manufacturer: Intel
Max CPU clock: 500 kHz to 740 kHz
Instruction set: 4-bit, BCD-oriented
Package: 24-pin DIP
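Since the instruction set is BCD-oriented, arithmetic proceeds one 4-bit decimal digit at a time. A hedged Python sketch of the idea behind such digit-serial addition (the 4040's actual instruction sequences are not shown here):

```python
# BCD-style addition: each decimal digit occupies one 4-bit nibble, and
# the sum is built digit by digit with a propagated carry, mimicking how
# calculator chips of the era worked through numbers.
def bcd_add(a_digits, b_digits):
    """Add two equal-length digit lists, least significant digit first."""
    result, carry = [], 0
    for a, b in zip(a_digits, b_digits):
        s = a + b + carry
        carry, digit = divmod(s, 10)     # one decimal digit per nibble
        result.append(digit)
    if carry:
        result.append(carry)
    return result

print(bcd_add([9, 7, 2], [5, 4, 3]))     # 279 + 345 = 624 -> [4, 2, 6]
```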
Atanasoff–Berry Computer
The Atanasoff–Berry Computer (ABC) was the first electronic digital computing device. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when Atanasoff left Iowa State University for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.
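To make the ABC's one task concrete, here is the mathematical job it automated, sketched as Gaussian elimination in Python. The ABC itself used binary arithmetic and regenerative capacitor memory rather than anything resembling this code; this is just the class of problem it solved:

```python
# Solving a linear system Ax = b by Gaussian elimination with partial
# pivoting and back-substitution -- the task the ABC was built for.
def solve(A, b):
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]            # row swap for stability
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    x = [0.0] * n
    for i in reversed(range(n)):                       # back-substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))    # -> [1.0, 3.0]
```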
John Vincent Atanasoff's and Clifford Berry's computer work was not widely known until it was rediscovered in the 1960s, amidst conflicting claims about the first instance of an electronic computer. The ENIAC computer was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ABC was the first "computer".
Norden bombsight
The Norden bombsight was used by the United States Army Air Force during World War II, the Korean War, and the Vietnam War to help the crews of bomber aircraft drop bombs accurately. Its operation was a closely guarded secret during World War II.
Operational efficiency
The Norden was developed during a period of United States non-interventionism when the dominant US military strategy was the defense of the United States and its possessions. A considerable amount of the US's strategy was based on stopping attempted attacks at sea, both with direct Naval power, and starting in the 1930s, with US Army Air Force airpower.
Airpower had been coming into its own as an anti-shipping weapon, but hitting a moving ship at sea was a difficult task. Most forces of the era invested heavily in dive bombers or torpedo bombers, but these generally had limited range and were only suitable in a strategic sense for carrier basing. The Army instead invested in the combination of the Norden and B-17, which it was believed would have enough accuracy to allow formations of B-17s to successfully attack shipping at long distances from the USAAF's land bases. Using the Norden, bombardiers could, in theory, drop their bombs within a 100 foot (ca 30 m) circle from an altitude of well over 20,000 feet (ca. 7 km). The high altitude would allow for long cruising ranges and keep them out of range of most ship-borne anti-aircraft fire while the bomb pattern would still give an acceptable probability of a "hit". The Norden was marketed as the tool to win the war; and it was often claimed that the bombsight could drop bombs into pickle barrels.
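The scale of the bombardier's problem is easy to estimate with vacuum ballistics, ignoring drag and wind entirely; the airspeed below is an assumed cruise figure, not a documented one:

```python
# From 20,000 ft, how long does a bomb fall and how far ahead of the
# target must it be released? Vacuum only -- no drag, wind, or supersonic
# effects, which is exactly why real accuracy fell short of theory.
import math

g = 9.81                      # m/s^2
altitude_m = 20_000 * 0.3048  # 20,000 ft in metres
speed_ms = 67.0               # ~150 mph ground speed (assumed)

t = math.sqrt(2 * altitude_m / g)        # fall time in a vacuum
forward_throw = speed_ms * t             # horizontal travel while falling
print(f"fall time ~{t:.0f} s, release ~{forward_throw/1000:.1f} km before target")
# ~35 s and ~2.4 km: a tiny heading or timing error easily swamps a 30 m circle.
```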
In practice the Norden never managed to produce accuracies remotely like those it was theoretically capable of. The RAF were the first to use the B-17 in combat, and reported extremely poor results, eventually converting their aircraft to other duties. USAAF anti-shipping operations in the Far East were likewise generally unsuccessful, and although there were numerous claims of sinkings, the only confirmed successful action was during the Battle of the Philippines, when B-17s damaged two Japanese transports, the cruiser Naka and the destroyer Murasame, and sank one minesweeper. However, these successes were the exception rather than the rule; actions during the Battle of the Coral Sea or the Battle of Midway, for instance, were entirely unsuccessful. The USAAF eventually replaced all of their B-17s with other aircraft, and came to use the skip bombing technique in direct low-level attacks.
In Europe the Norden likewise demonstrated poor real-world accuracy. Under perfect conditions only 50 percent of American bombs fell within a quarter of a mile of the target, and American flyers estimated that as many as 90 percent of bombs could miss their targets.[1][2][3] Nevertheless, many veteran B-17 and B-24 bombardiers swore by the Norden.
Many factors have been put forth to explain the Norden's poor performance. Over Europe, cloud cover was a common explanation, although performance did not improve even in favorable conditions. Accuracy did improve with the introduction of the "master bomber" concept, under which only a single aircraft would actually use the Norden while the rest simply dropped on its command. This suggests that much of the problem is attributable to the bombardier. Over Japan, bomber crews soon discovered strong winds at high altitudes, the so-called jet streams, but the Norden bombsight worked only for wind speeds with minimal wind shear. Additionally, the bombing altitude over Japan reached up to 30,000 feet (9,100 m), but most of the testing had been done well below 20,000 ft (6,100 m). An additional factor was that the shape and even the paint of the bomb casing greatly changed the aerodynamic properties of the weapon; and, at that time, nobody knew how to calculate the trajectory of bombs that reached supersonic speeds during their fall.
In both theaters of war, one vulnerability was that when the bombardier auto-piloted the aircraft using the bombsight, the aircraft was more susceptible to anti-aircraft fire and collisions with other allied aircraft.
As a mechanical device, the Norden bombsight used complex machinery consisting of many gearwheels and ball bearings, which were prone to produce inaccuracies if not properly maintained. In fact, many bombsights were rushed into wartime use without thorough testing. Often the bombardier had to oil the sight and repair failures himself. For some time into the war, equipped and qualified ground-crew technical staff were simply not available in sufficient numbers.
Difference engine
J.H. Müller, an engineer in the Hessian army, conceived the idea in a book published in 1786, but failed to find funding to develop it further.
In 1822, Charles Babbage proposed the use of such a machine in a paper to the Royal Astronomical Society on 14 June entitled "Note on the application of machinery to the computation of very big mathematical tables." This machine used the decimal number system and was powered by cranking a handle. The British government initially financed the project, but withdrew funding when Babbage repeatedly asked for more money whilst making no apparent progress on building the machine. Babbage went on to design his much more general analytical engine but later produced an improved difference engine design (his "Difference Engine No. 2") between 1847 and 1849. Inspired by Babbage's difference engine plans, Per Georg Scheutz built several difference engines from 1855 onwards; one was sold to the British government in 1859. Martin Wiberg improved Scheutz's construction but used his device only for producing and publishing printed logarithmic tables.[citation needed]
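The principle Babbage mechanized is the method of finite differences: for a polynomial, the n-th differences are constant, so every new table entry needs only additions, never multiplication. A short Python illustration using f(x) = x² + x + 41, the polynomial often cited in demonstrations of the engine:

```python
# Tabulating f(x) = x^2 + x + 41 by additions alone, the way a
# difference engine does: keep a running value, a running first
# difference, and the constant second difference.
f = lambda x: x * x + x + 41

# Seed values: f(0), the first difference f(1)-f(0), and the constant
# second difference (2 for any quadratic with leading coefficient 1).
value, d1, d2 = f(0), f(1) - f(0), 2     # 41, 2, 2

table = []
for x in range(8):
    table.append(value)
    value += d1      # next value: add the running first difference
    d1 += d2         # next first difference: add the constant second difference
print(table)         # [41, 43, 47, 53, 61, 71, 83, 97]
```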
Based on Babbage's original plans, the London Science Museum constructed a working Difference Engine No. 2 from 1989 to 1991, under Doron Swade, the then Curator of Computing. This was to celebrate the 200th anniversary of Babbage's birth. In 2000, the printer which Babbage originally designed for the difference engine was also completed. The conversion of the original design drawings into drawings suitable for engineering manufacturers' use revealed some minor errors in Babbage's design, which had to be corrected. Once completed, both the engine and its printer worked flawlessly, and still do. The difference engine and printer were constructed to tolerances achievable with 19th century technology, resolving a long-standing debate whether Babbage's design would actually have worked. (One of the reasons formerly advanced for the non-completion of Babbage's engines had been that engineering methods were insufficiently developed in the Victorian era.) In addition to funding the construction of the output mechanism for the Science Museum's Difference Engine No. 2, Nathan Myhrvold commissioned the construction of a second complete Difference Engine No. 2, which will be on exhibit at the Computer History Museum in Mountain View, California from 10 May 2008 through April 2009.
Antikythera mechanism
The Antikythera mechanism is an ancient mechanical calculator (also described as the first "mechanical computer"[1][2]) designed to calculate astronomical positions. It was discovered in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, in 1900. Subsequent investigation, particularly in 2006, dated it to about 150-100 BC, and hypothesised that it was on board a ship that sank en route from the Greek island of Rhodes to Rome, perhaps as part of an official loot. Technological artifacts of similar complexity did not reappear until a thousand years later.
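One flavour of astronomy such gearwork can encode is the 19-year Metonic cycle, in which 19 solar years very nearly equal 235 synodic (lunar) months; a Metonic dial was identified on the mechanism in the 2006 investigations. Whether the figures below match the mechanism's exact gear counts is beyond this sketch, which only checks how good the approximation is:

```python
# How closely do 19 solar years match 235 lunar months? A fixed gear
# ratio can only embody a relation this good.
year_days = 365.2425           # mean solar year
synodic_month_days = 29.53059  # mean lunar (synodic) month

cycle = 19 * year_days
months = 235 * synodic_month_days
print(f"19 years   = {cycle:.2f} days")
print(f"235 months = {months:.2f} days, difference {abs(cycle - months):.2f} days")
# The two agree to within about two hours over 19 years.
```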
Networking and the Internet
Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems like Sabre.
In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. This effort was funded by ARPA (now DARPA), and the computer network that it produced was called the ARPANET. The technologies that made the Arpanet possible spread and evolved. In time, the network spread beyond academic and military institutions and became known as the Internet.
The emergence of networking involved a redefinition of the nature and boundaries of the computer. Computer operating systems and applications were modified to include the ability to define and access the resources of other computers on the network, such as peripheral devices, stored information, and the like, as extensions of the resources of an individual computer.
Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies like Ethernet and ADSL, saw computer networking become almost ubiquitous. In fact, the number of computers that are networked is growing phenomenally. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. "Wireless" networking, often utilizing mobile phone networks, has meant networking is becoming increasingly ubiquitous even in mobile computing environments.
1970s punched card
A punch card or punched card (or punchcard, Hollerith card, or IBM card) is a piece of stiff paper that contains digital information represented by the presence or absence of holes in predefined positions. Now almost an obsolete recording medium, punched cards were widely used throughout the 19th century for controlling textile looms, and in the late 19th and early 20th century for operating fairground organs and related instruments. They were used through the 20th century in unit record machines for input, processing, and data storage. Early digital computers used punched cards as the primary medium for input of both computer programs and data, with offline data entry on key punch machines. Some voting machines still use punched cards.
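The core idea, data as holes at fixed positions, is simple to mimic. The toy below punches one column per character using plain ASCII bits; the real Hollerith/IBM card used a 12-row zone-and-digit code, not this scheme:

```python
# Toy punched-card encoder: information as the presence ('O') or absence
# ('.') of holes at fixed positions. One column per character, one row
# per ASCII bit -- NOT the historical Hollerith code.
def punch(text: str) -> list[str]:
    rows = []
    for bit in range(7, -1, -1):          # one row per bit, MSB on top
        rows.append("".join("O" if ord(c) >> bit & 1 else "." for c in text))
    return rows

for row in punch("IBM"):
    print(row)
```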
Microprocessors
A microprocessor incorporates most or all of the functions of a central processing unit (CPU) on a single integrated circuit (IC).[1] The first microprocessors emerged in the early 1970s and were used for electronic calculators, performing BCD arithmetic on 4-bit words. Other embedded uses of 4- and 8-bit microprocessors, such as terminals, printers, and various kinds of automation, followed rather quickly. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers in the mid-1970s.
Processors were for a long period constructed out of small and medium-scale ICs containing the equivalent of a few to a few hundred transistors. The integration of the whole CPU onto a single VLSI chip therefore greatly reduced the cost of processing capacity. From their humble beginnings, continued increases in microprocessor capacity have rendered other forms of computers almost completely obsolete (see history of computing hardware), with one or more microprocessor as processing element in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers.
Since the early 1970s, the increase in processing capacity of evolving microprocessors has been known to generally follow Moore's Law. It suggests that the complexity of an integrated circuit, with respect to minimum component cost, doubles every 18 months. In the late 1990s, heat generation (TDP), due to current leakage and other factors, emerged as a leading developmental constraint.
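The "doubles every 18 months" claim is just compound doubling, and easy to tabulate. A sketch starting from the Intel 4004's roughly 2,300 transistors; real chip counts only loosely tracked this curve:

```python
# Projected transistor count under an 18-month doubling period, seeded
# with the Intel 4004's ~2,300 transistors (1971). Inputs are the only
# assumptions; actual chips deviated from this idealized curve.
def transistors(year, base_year=1971, base_count=2300, doubling_months=18):
    months = (year - base_year) * 12
    return base_count * 2 ** (months / doubling_months)

for year in (1971, 1980, 1990, 2000):
    print(year, f"{transistors(year):,.0f}")
# 1971 -> 2,300; 1980 -> ~147k; 1990 -> ~15M; 2000 -> ~1.5 billion
# (an overestimate -- one reason a 24-month period is often quoted instead).
```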
EDSAC
A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult (Shannon 1940). Notable achievements include:
EDSAC was one of the first computers to implement the stored program (von Neumann) architecture.
Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating point arithmetic and a measure of programmability. In 1998 the Z3 was proved to be Turing complete, therefore being the world's first operational computer.
The non-programmable Atanasoff–Berry Computer (1941), which used vacuum tube based computation, binary numbers, and regenerative capacitor memory.
The secret British Colossus computers (1943)[5], which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. They were used for breaking German wartime codes.
The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.
The U.S. Army's Ballistics Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.
Several developers of ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which came to be known as the stored program architecture or von Neumann architecture. This design was first formally described by John von Neumann in the paper "First Draft of a Report on the EDVAC", published in 1945. A number of projects to develop computers based on the stored program architecture commenced around this time, the first of these being completed in Great Britain. The first to be demonstrated working was the Manchester Small-Scale Experimental Machine (SSEM) or "Baby". However, the EDSAC, completed a year after SSEM, was perhaps the first practical implementation of the stored program design. Shortly thereafter, the machine originally described by von Neumann's paper—EDVAC—was completed but did not see full-time use for an additional two years.
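What "stored program" buys is easiest to see in miniature: instructions and data share one memory, so a program is itself just data that can be loaded or altered. A deliberately tiny toy machine, with an invented four-instruction set:

```python
# Toy stored-program machine: instructions and data live in the same
# memory. Each instruction is an (opcode, operand) pair; the data words
# sit right after the code, in the same list.
def run(mem):
    acc, pc = 0, 0
    while True:
        op, arg = mem[pc]
        pc += 1
        if op == "LOAD":
            acc = mem[arg]            # read a data word from memory
        elif op == "ADD":
            acc += mem[arg]
        elif op == "STORE":
            mem[arg] = acc            # write a data word back to memory
        elif op == "HALT":
            return mem

program = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    20, 22, 0,    # data: two addends and a result slot, in the same memory
]
print(run(program)[6])   # 42
```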
Nearly all modern computers implement some form of the stored program architecture, making it the single trait by which the word "computer" is now defined. By this standard, many earlier devices would no longer be called computers by today's definition, but are usually referred to as such in their historical context. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture. The design made the universal computer a practical reality.
Microprocessors are miniaturized devices that often implement stored program CPUs.
Vacuum tube-based computers were in use throughout the 1950s. Vacuum tubes were largely replaced in the 1960s by transistor-based computers. When compared with tubes, transistors are smaller, faster, cheaper, use less power, and are more reliable. In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, caused another generation of decreased size and cost, and another generation of increased speed and reliability. By the 1980s, computers became sufficiently small and cheap to replace simple mechanical controls in domestic appliances such as washing machines. The 1980s also witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household.
Jacquard loom
History of computing
The Jacquard loom was one of the first programmable devices.
It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to varying interpretations over time. Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the aid of a mechanical calculating device.
The history of the modern computer begins with two separate technologies - that of automated calculation and that of programmability.
Examples of early mechanical calculating devices included the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC). The end of the Middle Ages saw a re-invigoration of European mathematics and engineering, and Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers. However, none of those devices fit the modern definition of a computer because they could not be programmed.
Hero of Alexandria (c. 10 – 70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions - and when.[3] This is the essence of programmability. In 1801, Joseph Marie Jacquard made an improvement to the textile loom that used a series of punched paper cards as a template to allow his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.
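To see why the cards count as a program, consider how each card row maps to an action of the loom: a hole lifts a warp thread, no hole leaves it down, and the stack of cards replays the pattern pick after pick. The pattern below is invented for illustration:

```python
# Sketch of the Jacquard idea: each card row is a fixed pattern of
# holes, and each hole decides whether one warp thread is lifted for
# that pass of the shuttle. The cards ARE the program.
cards = [
    "O..O..O.",
    ".O..O..O",
    "O..O..O.",
    ".O..O..O",
]

for row in cards:                      # one card (row) per pick of the weft
    lifted = [i for i, hole in enumerate(row) if hole == "O"]
    shed = "".join("#" if i in lifted else "-" for i in range(len(row)))
    print(f"lift warps {lifted} -> {shed}")
```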
It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer that he called "The Analytical Engine".[4] Due to limited finances, and an inability to resist tinkering with the design, Babbage never actually built his Analytical Engine.
Large-scale automated data processing of punched cards was performed for the U.S. Census in 1890 by tabulating machines designed by Herman Hollerith and manufactured by the Computing Tabulating Recording Corporation, which later became IBM. By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.
Computer
A computer is a machine that manipulates data according to a list of instructions.
The first devices that resemble modern computers date to the mid-20th century (around 1940-1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers. Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space. Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices that are used to control other devices — for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and children's toys.
The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.
(Image: a ceiling smoke alarm.)
Nuclear technology
Nuclear technology is technology that involves the reactions of atomic nuclei. It has found applications ranging from smoke detectors to nuclear reactors, and from gun sights to nuclear weapons. There is a great deal of public concern about its possible implications, and every application of nuclear technology is reviewed with care.
History
Discovery
In 1896, Henri Becquerel was investigating phosphorescence in uranium salts when he discovered a new phenomenon which came to be called radioactivity.[1] He, Pierre Curie and Marie Curie began investigating the phenomenon. In the process they isolated the element radium, which is highly radioactive. They discovered that radioactive materials produce intense, penetrating rays of several distinct sorts, which they called alpha rays, beta rays and gamma rays. Some of these kinds of radiation could pass through ordinary matter, and all of them could cause damage in large doses; all the early researchers received various radiation burns, much like sunburn, and thought little of it.
The new phenomenon of radioactivity was seized upon by the manufacturers of quack medicine (as the discoveries of electricity and magnetism had been earlier), and any number of patent medicines and treatments involving radioactivity were put forward. Gradually it came to be realized that the radiation produced by radioactive decay was ionizing radiation, and that quantities too small to burn presented a severe long-term hazard. Many of the scientists working on radioactivity died of cancer as a result of their exposure. Radioactive patent medicines mostly disappeared, but other applications of radioactive materials persisted, such as the use of radium salts to produce glowing dials on meters.
As the atom came to be better understood, the nature of radioactivity became clearer: some atomic nuclei are unstable, and can decay, releasing energy in the form of gamma rays (high-energy photons), alpha particles (a pair of protons and a pair of neutrons), and beta particles (high-energy electrons).
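Quantitatively, such decay follows the half-life law N(t) = N0 · (1/2)^(t/T), where T is the half-life of the isotope. The short Python sketch below illustrates it; the sample numbers are our own (5,730 years is roughly the half-life of carbon-14), not figures from the text.

# Exponential radioactive decay: N(t) = N0 * (1/2) ** (t / half_life).
# Sample values are illustrative only.

def remaining(n0, half_life, t):
    """Amount of an unstable isotope left after time t (same units as half_life)."""
    return n0 * 0.5 ** (t / half_life)

print(remaining(1000.0, 5730.0, 5730.0))   # after one half-life: 500.0
print(remaining(1000.0, 5730.0, 11460.0))  # after two half-lives: 250.0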
Computing
Computing is the activity of developing and using computer technology, including computer hardware and software. It is the computer-specific part of information technology. Computer science is the study of the theoretical foundations of computing and the application of the theories in computing.
Computing Curricula 2005[1] defined computing:
In a general way, we can define computing to mean any goal-oriented activity requiring, benefiting from, or creating computers. Thus, computing includes designing and building hardware and software systems for a wide range of purposes; processing, structuring, and managing various kinds of information; doing scientific studies using computers; making computer systems behave intelligently; creating and using communications and entertainment media; finding and gathering information relevant to any particular purpose, and so on. The list is virtually endless, and the possibilities are vast.
Definitions
The term computing has sometimes been narrowly defined, as in a 1989 ACM report on Computing as a Discipline:
The discipline of computing is the systematic study of algorithmic processes that describe and transform information: their theory, analysis, design, efficiency, implementation, and application. The fundamental question underlying all computing is 'What can be (efficiently) automated?'
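As a concrete instance of an "algorithmic process that describes and transforms information", consider Euclid's algorithm for the greatest common divisor, sketched below in Python; the example is ours, not taken from the ACM report.

# Euclid's algorithm: a classic algorithmic process that transforms
# a pair of integers into their greatest common divisor.

def gcd(a, b):
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until b is 0
    return a

print(gcd(48, 36))  # 12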
Computing Curricula 2005 also recognizes that the meaning of computing depends on the context:
Computing also has other meanings that are more specific, based on the context in which the term is used. For example, an information systems specialist will view computing somewhat differently from a software engineer. Regardless of the context, doing computing well can be complicated and difficult. Because society needs people to do computing well, we must think of computing not only as a profession but also as a discipline.
The term computing is also synonymous with counting and calculating. In earlier times it was used in reference to mechanical computing machines.
Science and theory
1. Computer science
2. Theory of computation
3. Computational models
4. Digital Bibliography & Library Project (as of July 2007, over 910,000 bibliographic entries on computer science and several thousand links to the home pages of computer scientists)
5. Scientific computing
6. Metacomputing
Hardware
1. Computer
2. Computer hardware
3. Computer hardware design
4. Computer network
5. Computer system
6. History of computing hardware
Software
1. Software engineering
2. Computer programming
3. Computational
4. Software patent
5. Firmware
6. Operating systems
7. Application software
8. Databases
9. Geographic information system
10. Spreadsheet
11. Word processor
12. Programming languages
13. Interpreters
14. Compilers
15. Speech recognition
Technology
Technology is a broad concept that deals with a species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. In human society, it is a consequence of science and engineering, although several technological advances predate the two concepts. Technology is a term with origins in the Greek "technologia", "τεχνολογία" — "techne", "τέχνη" ("craft") and "logia", "λογία" ("saying").[1] However, a strict definition is elusive; "technology" can refer to material objects of use to humanity, such as machines, hardware or utensils, but can also encompass broader themes, including systems, methods of organization, and techniques. The term can either be applied generally or to specific areas: examples include "construction technology", "medical technology", or "state-of-the-art technology".
The human race's use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.
Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, claiming that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.
Definition and usage
(Image: the invention of the printing press made it possible for scientists and politicians to communicate their ideas with ease, leading to the Age of Enlightenment; an example of technology as a cultural force.)
In general, technology is the relationship that society has with its tools and crafts, and to what extent society can control its environment. The Merriam-Webster dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge". Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here". The term is often used to imply a specific field of technology, or to refer to high technology rather than technology as a whole. Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."
Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, fall under this definition of technology.
The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.
Technology can be viewed as an activity that forms or changes culture. Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer. Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.
Science, engineering and technology
The distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method. Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.
Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.
Technology is often a consequence of science and engineering — although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.
Role in human history
Main articles: History of technology and Timeline of invention
Paleolithic (2.5 million – 10,000 BC)
(Image: a primitive chopper.)
The use of tools by early humans was partly a process of discovery, partly of evolution. Early humans evolved from a race of foraging hominids which were already bipedal, with a brain mass approximately one third that of modern humans. Tool use remained relatively unchanged for most of early human history, but approximately 50,000 years ago a complex set of behaviors and tool use emerged, believed by many archaeologists to be connected to the emergence of fully modern language.
Stone tools
(Images: hand axes from the Acheulian period; a Clovis point, made via pressure flaking.)
Human ancestors have been using stone and other tools since long before the emergence of Homo sapiens approximately 200,000 years ago. The earliest methods of stone tool making, known as the Oldowan "industry", date back to at least 2.3 million years ago, with the earliest direct evidence of tool usage found in Ethiopia within the Great Rift Valley, dating back to 2.5 million years ago. This era of stone tool use is called the Paleolithic, or "Old Stone Age", and spans all of human history up to the development of agriculture approximately 12,000 years ago.
To make a stone tool, a "core" of hard stone with specific flaking properties (such as flint) was struck with a hammerstone. This flaking produced a sharp edge on the core stone as well as on the flakes, either of which could be used as tools, primarily in the form of choppers or scrapers. These tools greatly aided the early humans in their hunter-gatherer lifestyle to perform a variety of tasks including butchering carcasses (and breaking bones to get at the marrow); chopping wood; cracking open nuts; skinning an animal for its hide; and even forming other tools out of softer materials such as bone and wood.
The earliest stone tools were crude, being little more than a fractured rock. In the Acheulian era, beginning approximately 1.65 million years ago, methods of working these stones into specific shapes, such as hand axes, emerged. The Middle Paleolithic, approximately 300,000 years ago, saw the introduction of the prepared-core technique, where multiple blades could be rapidly formed from a single core stone. The Upper Paleolithic, beginning approximately 40,000 years ago, saw the introduction of pressure flaking, where a wood, bone, or antler punch could be used to shape a stone very finely.
Fire
The discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind. The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1,000,000 BCE, and scholarly consensus indicates that Homo erectus had controlled fire by between 500,000 BCE and 400,000 BCE. Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.
Clothing and shelter
Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BCE, humans were constructing temporary wood huts. Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BCE and into other continents, such as Eurasia.
Humans began to work bones, antler, and hides, as evidenced by burins and racloirs produced during this period.
Neolithic through Classical Antiquity (10,000 BCE – 300 CE)
(Image: an array of Neolithic artifacts, including bracelets, axe heads, chisels, and polishing tools.)
Man's technological ascent began in earnest in what is known as the Neolithic period ("New Stone Age"). The invention of polished stone axes was a major advance because it allowed forest clearance on a large scale to create farms. The discovery of agriculture allowed for the feeding of larger populations, and the transition to a sedentary lifestyle increased the number of children that could be simultaneously raised, as young children no longer needed to be carried as they were in the nomadic lifestyle. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer lifestyle.
With this increase in population and availability of labor came an increase in labor specialization.[29] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures, the specialization of labor, trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges, such as the building of dikes and reservoirs, are all thought to have played a role.[30]
Metal tools
Continuing improvements led to the furnace and bellows and provided the ability to smelt and forge native metals (naturally occurring in relatively pure form).[31] Gold, copper, silver, and lead were such early metals. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 8000 BCE).[32] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first use of iron alloys such as steel dates to around 1400 BCE.
Energy and Transport
Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailboat; the earliest record of a ship under sail is shown on an Egyptian pot dating back to 3200 BCE. From prehistoric times, Egyptians probably used the power of the annual Nile floods to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and 'catch' basins. Similarly, the early peoples of Mesopotamia, the Sumerians, learned to use the Tigris and Euphrates rivers for much the same purposes. But more extensive use of wind and water (and even human) power required another invention.
The wheel was invented circa 4000 BCE, likely independently in Mesopotamia (in present-day Iraq) as well. Estimates of when this may have occurred range from 5500 to 3000 BCE, with most experts putting it closer to 4000 BCE. The oldest artifacts with drawings that depict wheeled carts date from about 3000 BCE; however, the wheel may have been in use for millennia before these drawings were made. There is also evidence from the same period that wheels were used for the production of pottery. (Note that the original potter's wheel was probably not a wheel, but rather an irregularly shaped slab of flat wood with a small hollowed or pierced area near the center, mounted on a peg driven into the earth. It would have been rotated by repeated tugs by the potter or his assistant.) More recently, the oldest known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[33]
The invention of the wheel revolutionized activities as disparate as transportation, war, and the production of pottery (for which it may have been first used). It did not take long to discover that wheeled wagons could be used to carry heavy loads, and fast (rotary) potters' wheels enabled early mass production of pottery. But it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources.
Modern history (0 CE onward)
(Image: an integrated circuit, a key foundation for modern computers.)
Tools include both simple machines (such as the lever, the screw, and the pulley) and more complex machines (such as the clock, the engine, the electric generator and the electric motor, the computer, radio, and the Space Station, among many others). As tools increase in complexity, so does the type of knowledge needed to support them. Complex modern machines require libraries of written technical manuals of collected information that has continually increased and improved; their designers, builders, maintainers, and users often require the mastery of decades of sophisticated general and specific training. Moreover, these tools have become so complex that a comprehensive infrastructure of technical knowledge-based lesser tools, processes and practices (complex tools in themselves) exists to support them, including engineering, medicine, and computer science. Complex manufacturing and construction techniques and organizations are needed to construct and maintain them. Entire industries have arisen to support and develop succeeding generations of increasingly complex tools.
The relationship of technology with society (culture) is generally characterized as synergistic, symbiotic, co-dependent, co-influential, and co-producing; that is, technology and society depend heavily upon one another (technology upon culture, and culture upon technology). It is also generally believed that this synergistic relationship first occurred at the dawn of humankind with the invention of simple tools, and it continues with modern technologies today. Today, as throughout history, technology influences and is influenced by societal issues and factors such as economics, values, ethics, institutions, groups, the environment, and government. The discipline studying the impacts of science, technology, and society, and vice versa, is called Science and technology in society.
Technology and philosophy
Technicism
Generally, technicism is an overreliance on or overconfidence in technology as a benefactor of society.
Taken to an extreme, technicism becomes the belief that humanity will ultimately be able to control the entirety of existence using technology; in other words, that human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Monsma,[34] connect these ideas to the abdication of religion as a higher moral authority.
More commonly, "technicism" is used critically, to name the widely held belief that newer, more recently developed technology is "better": for example, that more recently developed computers are faster than older computers, or that more recently developed cars have greater fuel efficiency and more features than older cars. Because current technologies are generally accepted as good, future technological developments are not considered circumspectly, resulting in what seems to be a blind acceptance of technological development.
Optimism
See also: Extropianism
Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for society and the human condition. In these ideologies, technological development is morally good. Some critics see these ideologies as examples of scientism and techno-utopianism, and fear the notions of human enhancement and technological singularity that they support. Some have described Karl Marx as a techno-optimist.[35]
Pessimism
See also: Luddite, Neo-luddism, Anarcho-primitivism, and Bioconservatism
On the somewhat pessimistic side are certain philosophers, such as Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed a priori. They suggest that the result of such a society is to become ever more technological at the cost of freedom and psychological health (and probably physical health in general, as pollution from technological products is dispersed).
Many, such as the Luddites and the prominent philosopher Martin Heidegger, hold serious reservations about technology, though without regarding it as a priori flawed. Heidegger presents such a view in "The Question Concerning Technology": "Thus we shall never experience our relationship to the essence of technology so long as we merely conceive and push forward the technological, put up with it, or evade it. Everywhere we remain unfree and chained to technology, whether we passionately affirm or deny it."[36]
Some of the most poignant criticisms of technology are found in what are now considered dystopian literary classics, for example Aldous Huxley's Brave New World and other writings, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. In Goethe's Faust, the hero's selling of his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology.
An overtly anti-technological treatise is Industrial Society and Its Future, written by Theodore Kaczynski (aka The Unabomber) and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure.
Appropriate technology
See also: Technocriticism and Technorealism
The notion of appropriate technology was developed in the 20th century (see, for example, the work of Jacques Ellul) to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or to parts or skills imported from elsewhere. The eco-village movement emerged in part due to this concern.
Other species
(Image: an adult gorilla using a branch as a walking stick to gauge the water's depth, an example of technology usage by primates. Credit: Public Library of Science.)
The use of basic technology is also a feature of species other than humans. These include primates such as chimpanzees, some dolphin communities,[37][38] and crows.[39][40]
The ability to make and use tools was once considered a defining characteristic of the genus Homo.[41] However, the discovery of tool construction among chimpanzees and related primates has overturned the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees utilising tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[42] West African chimpanzees also use stone hammers and anvils for cracking nuts.[43]
See also
Main article: List of basic technology topics
List of emerging technologies · Bernard Stiegler · Golden hammer · Critique of technology · Game-changing technology · High technology · History of science and technology · Innovation · Internet · Knowledge economy · Lewis Mumford · Luddite · Technology assessment · Timeline of invention · Technological convergence · Technology tree · List of "ologies" · Science and technology · Technological superpowers
Theories and concepts in technology
Main article: Theories of technology
Appropriate technology · Diffusion of innovations · Paradigm · Philosophy of technology · Posthumanism · Precautionary principle · Strategy of technology · Techno-progressivism · Technocriticism · Technological evolution · Technological determinism · Technological nationalism · Technological singularity · Technological society · Technorealism · Technological revival · Transhumanism · Technology management
Economics of technology
Technocapitalism · Technological diffusion · Technology acceptance model · Technology lifecycle · Technology transfer
Notes
1. "Definition of technology". Merriam-Webster. Retrieved on 2007-02-16.
2. Franklin, Ursula. "Real World of Technology". House of Anansi Press. Retrieved on 2007-02-13.
3. "Technology news". BBC News. Retrieved on 2006-02-17.
4. Stiegler, Bernard (1998). Technics and Time, 1: The Fault of Epimetheus. Stanford University Press, pp. 17, 82. ISBN 0-8047-3041-3.
5. "Industry, Technology and the Global Marketplace: International Patenting Trends in Two New Technology Areas". Science and Engineering Indicators 2002. National Science Foundation. Retrieved on 2007-05-07.
6. Borgmann, Albert (2006). "Technology as a Cultural Force: For Alena and Griffin". The Canadian Journal of Sociology 31 (3): 351–360. doi:10.1353/cjs.2006.0050. Retrieved on 2007-02-16.
7. Macek, Jakub. "Defining Cyberculture". Retrieved on 2007-05-25.
8. "Science". Dictionary.com. Retrieved on 2007-02-17.
9. "Intute: Science, Engineering and Technology". Intute. Retrieved on 2007-02-17.
10. "Mother of man - 3.2 million years ago". BBC. Retrieved on 2008-05-17.
11. "Human Evolution". History Channel. Retrieved on 2008-05-17.
12. Wade, Nicholas (2003-07-15). "Early Voices: The Leap to Language". The New York Times. Retrieved on 2008-05-17.
13. "Human Ancestors Hall: Homo sapiens". Smithsonian Institution. Retrieved on 2007-12-08.
14. "Ancient 'tool factory' uncovered". BBC News (1999-05-06). Retrieved on 2007-02-18.
15. Heinzelin, Jean de (April 1999). "Environment and Behavior of 2.5-Million-Year-Old Bouri Hominids". Science 284 (5414): 625–629. doi:10.1126/science.284.5414.625. PMID 10213682.
16. Burke, Ariane. "Archaeology". Encyclopedia Americana. Retrieved on 2008-05-17.
17. Plummer, Thomas (2004). "Flaked Stones and Old Bones: Biological and Cultural Evolution at the Dawn of Technology". Yearbook of Physical Anthropology (47).
18. Haviland, William A. (2004). Cultural Anthropology: The Human Challenge. The Thomson Corporation, p. 77. ISBN 0534624871.
19. Crump, Thomas (2001). A Brief History of Science. Constable & Robinson, p. 9. ISBN 1-84119-235-X.
20. "Fossil Hominid Sites of Sterkfontein, Swartkrans, Kromdraai, and Environs". UNESCO. Retrieved on 2007-03-10.
21. "History of Stone Age Man". History World. Retrieved on 2007-02-13.
22. James, Steven R. (February 1989). "Hominid Use of Fire in the Lower and Middle Pleistocene". Current Anthropology 30 (1): 1–26. doi:10.1086/203705.
23. Stahl, Ann B. (1984). "Hominid dietary selection before fire". Current Anthropology 25: 151–168. doi:10.1086/203106.
24. O'Neil, Dennis. "Evolution of Modern Humans: Archaic Homo sapiens Culture". Palomar College. Retrieved on 2007-03-31.
25. Villa, Paola (1983). Terra Amata and the Middle Pleistocene archaeological record of southern France. Berkeley: University of California Press, p. 303. ISBN 0-520-09662-2.
26. Cordaux, Richard; Stoneking, Mark (2003). "South Asia, the Andamanese and the genetic evidence for an 'early' human dispersal out of Africa". American Journal of Human Genetics 72: 1586. doi:10.1086/375407.
27. "The First Baby Boom: Skeletal Evidence Shows Abrupt Worldwide Increase In Birth Rate During Neolithic Period". Science Daily (2006-01-04). Retrieved on 2008-05-17.
28. Sussman, Robert W.; Hall, Roberta L. (April 1972). "Child Transport, Family Size, and Increase in Human Population During the Neolithic". Current Anthropology 13 (2): 258–267. University of Chicago Press. doi:10.1086/201274. Retrieved on 2008-05-17.
29. Ferraro, Gary P. (2006). Cultural Anthropology: An Applied Perspective. The Thomson Corporation. Retrieved on 2008-05-17.
30. Patterson, Gordon M. (1992). The ESSENTIALS of Ancient History. Research & Education Association. Retrieved on 2008-05-17.
31. Cramb, Alan W. "A Short History of Metals". Carnegie Mellon University. Retrieved on 2007-01-08.
32. Chisholm, Hugh (1910). Encyclopædia Britannica. Retrieved on 2008-05-17.
33. "Slovenian Marsh Yields World's Oldest Wheel". Ameriška Domovina (2003-03-27). Retrieved on 2007-02-13.
34. Monsma, Stephen V. (1986). Responsible Technology. Grand Rapids: W.B. Eerdmans Pub. Co.
35. Hughes, James (2002). "Democratic Transhumanism 2.0". Retrieved on 2007-01-26.
36. Lovitt, William (1977). "The Question Concerning Technology", in The Question Concerning Technology and Other Essays. Harper Torchbooks, pp. 3–35. Retrieved on 2007-11-21.
37. Sagan, Carl; Druyan, Ann; Leakey, Richard. "Chimpanzee Tool Use". Retrieved on 2007-02-13.
38. Rincon, Paul (2005-06-07). "Sponging dolphins learn from mum". BBC News. Retrieved on 2007-02-13.
39. Schmid, Randolph E. (2007-10-04). "Crows use tools to find food". MSNBC. Retrieved on 2008-05-17.
40. Rutz, C.; Bluff, L.A.; Weir, A.A.S.; Kacelnik, A. (2007-10-04). "Video cameras on wild birds". Science.
41. Oakley, K. P. (1976). Man the Tool-Maker. University of Chicago Press. ISBN 978-0226612706.
42. McGrew, W. C. (1992). Chimpanzee Material Culture. ISBN 978-0521423717.
43. Boesch, Christophe; Boesch, Hedwige (1984). "Mental map in wild chimpanzees: An analysis of hammer transports for nut cracking". Primates 25: 160–170. doi:10.1007/BF02382388.
Further reading
Kremer, Michael (1993). "Population Growth and Technological Change: One Million B.C. to 1990". The Quarterly Journal of Economics 108 (3): 681–716.
The human race's use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.
Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, claiming that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.
Contents [hide]1 Definition and usage 2 Science, engineering and technology 3 Role in human history 3.1 Paleolithic (2.5 million – 10,000 BC) 3.1.1 Stone tools 3.1.2 Fire 3.1.3 Clothing and shelter 3.2 Neolithic through Classical Antiquity (10,000BCE – 300CE) 3.2.1 Metal tools 3.2.2 Energy and Transport 3.3 Modern history (0CE —) 4 Technology and philosophy 4.1 Technicism 4.2 Optimism 4.3 Pessimism 4.4 Appropriate technology 5 Other species 6 See also 6.1 Theories and concepts in technology 6.2 Economics of technology 7 Notes 8 References 9 Further reading
Definition and usage The invention of the printing press made it possible for scientists and politicians to communicate their ideas with ease, leading to the Age of Enlightenment; an example of technology as a cultural force.In general technology is the relationship that society has with its tools and crafts, and to what extent society can control its environment. The Merriam-Webster dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge".Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here".The term is often used to imply a specific field of technology, or to refer to high technology, rather than technology as a whole.Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."
Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, fall under this definition of technology.
The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.
Technology can be viewed as an activity that forms or changes culture.Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer.Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.
Science, engineering and technologyThe distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method.Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.
Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.
Technology is often a consequence of science and engineering — although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.
Role in human historyMain articles: History of technology and Timeline of invention
Paleolithic (2.5 million – 10,000 BC) A primitive chopperThe use of tools by early humans was partly a process of discovery, partly of evolution. Early humans evolved from a race of foraging hominids which were already bipedal, with a brain mass approximately one third that of modern humans.Tool use remained relatively unchanged for most of early human history, but approximately 50,000 years ago, a complex set of behaviors and tool use emerged, believed by many archaeologists to be connected to the emergence of fully-modern language.
Stone tools Hand axes from the Acheulian period A Clovis point, made via pressure flakingHuman ancestors have been using stone and other tools since long before the emergence of Homo sapiens approximately 200,000 years ago.The earliest methods of stone tool making, known as the Oldowan "industry", date back to at least 2.3 million years ago,with the earliest direct evidence of tool usage found in Ethiopia within the Great Rift Valley, dating back to 2.5 million years ago.This era of stone tool use is called the Paleolithic, or "Old stone age", and spans all of human history up to the development of agriculture approximately 12,000 years ago.
To make a stone tool, a "core" of hard stone with specific flaking properties (such as flint) was struck with a hammerstone. This flaking produced a sharp edge on the core stone as well as on the flakes, either of which could be used as tools, primarily in the form of choppers or scrapers.These tools greatly aided the early humans in their hunter-gatherer lifestyle to perform a variety of tasks including butchering carcasses (and breaking bones to get at the marrow); chopping wood; cracking open nuts; skinning an animal for its hide; and even forming other tools out of softer materials such as bone and wood.
The earliest stone tools were crude, being little more than a fractured rock. In the Acheulian era, beginning approximately 1.65 million years ago, methods of working these stone into specific shapes, such as hand axes emerged. The Middle Paleolithic, approximately 300,000 years ago, saw the introduction of the prepared-core technique, where multiple blades could be rapidly formed from a single core stone.The Upper Paleolithic, beginning approximately 40,000 years ago, saw the introduction of pressure flaking, where a wood, bone, or antler punch could be used to shape a stone very finely.
FireThe discovery and utilization of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1,000,000 BCE; scholarly consensus indicates that Homo erectus had controlled fire by between 500,000 BCE and 400,000 BCE.Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.
Clothing and shelterOther technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BCE, humans were constructing temporary wood huts.Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BCE and into other continents, such as Eurasia.
Humans began to work bones, antler, and hides, as evidenced by burins and racloirs produced during this period.[citation needed]
Neolithic through Classical Antiquity (10,000BCE – 300CE) An array of Neolithic artifacts, including bracelets, axe heads, chisels, and polishing tools.Man's technological ascent began in earnest in what is known as the Neolithic period ("New stone age"). The invention of polished stone axes was a major advance because it allowed forest clearance on a large scale to create farms. The discovery of agriculture allowed for the feeding of larger populations, and the transition to a sedentist lifestyle increased the number of children that could be simultaneously raised, as young children no longer needed to be carried, as was the case with the nomadic lifestyle. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer lifestyle.
With this increase in population and availability of labor came an increase in labor specialization.[29] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures, the specialization of labor, trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges, such as the building of dikes and reservoirs, are all thought to have played a role.[30]
Metal toolsContinuing improvements led to the furnace and bellows and provided the ability to smelt and forge native metals (naturally occurring in relatively pure form).[31] Gold, copper, silver, and lead, were such early metals. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 8000 BCE).[32] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first uses of iron alloys such as steel dates to around 1400 BCE.
Energy and TransportMeanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailboat.[citation needed] The earliest record of a ship under sail is shown on an Egyptian pot dating back to 3200 BCE.[citation needed] From prehistoric times, Egyptians probably used "the power of the Nile" annual floods to irrigate their lands, gradually learning to regulate much of it through purposely-built irrigation channels and 'catch' basins. Similarly, the early peoples of Mesopotamia, the Sumerians, learned to use the Tigris and Euphrates rivers for much the same purposes. But more extensive use of wind and water (and even human) power required another invention.
The wheel was invented in circa 4000 BCE.According to archaeologists, the wheel was invented around 4000 B.C. The wheel was likely independently invented in Mesopotamia (in present-day Iraq) as well. Estimates on when this may have occurred range from 5500 to 3000 B.C., with most experts putting it closer to 4000 B.C. The oldest artifacts with drawings that depict wheeled carts date from about 3000 B.C.; however, the wheel may have been in use for millennia before these drawings were made. There is also evidence from the same period of time that wheels were used for the production of pottery. (Note that the original potter's wheel was probably not a wheel, but rather an irregularly shaped slab of flat wood with a small hollowed or pierced area near the center and mounted on a peg driven into the earth. It would have been rotated by repeated tugs by the potter or his assistant.) More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[33]
The invention of the wheel revolutionized activities as disparate as transportation, war, and the production of pottery (for which it may have been first used). It didn't take long to discover that wheeled wagons could be used to carry heavy loads and fast (rotary) potters' wheels enabled early mass production of pottery. But it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources.
Modern history (0CE —)Tools include both simple machines (such as the lever, the screw, and the pulley), and more complex machines (such as the clock, the engine, the electric generator and the electric motor, the computer, radio, and the Space Station, among many others). An integrated circuit — a key foundation for modern computers. As tools increase in complexity, so does the type of knowledge needed to support them. Complex modern machines require libraries of written technical manuals of collected information that has continually increased and improved — their designers, builders, maintainers, and users often require the mastery of decades of sophisticated general and specific training. Moreover, these tools have become so complex that a comprehensive infrastructure of technical knowledge-based lesser tools, processes and practices (complex tools in themselves) exist to support them, including engineering, medicine, and computer science. Complex manufacturing and construction techniques and organizations are needed to construct and maintain them. Entire industries have arisen to support and develop succeeding generations of increasingly more complex tools. The relationship of technology with society ( culture) is generally characterized as synergistic, symbiotic, co-dependent, co-influential, and co-producing, i.e. technology and society depend heavily one upon the other (technology upon culture, and culture upon technology). It is also generally believed that this synergistic relationship first occurred at the dawn of humankind with the invention of simple tools, and continues with modern technologies today. Today and throughout history, technology influences and is influenced by such societal issues/factors as economics, values, ethics, institutions, groups, the environment, government, among others. The discipline studying the impacts of science, technology, and society and vice versa is called Science and technology in society.
Technology and philosophy
TechnicismGenerally, technicism is an over reliance or overconfidence in technology as a benefactor of society.
Taken to extreme, some argue that technicism is the belief that humanity will ultimately be able to control the entirety of existence using technology. In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Monsma,[34] connect these ideas to the abdication of religion as a higher moral authority.
More commonly, the term technicism is used to criticize the widely held belief that newer, more recently developed technology is "better." For example, newer computers are faster than older ones, and newer cars have greater fuel efficiency and more features than older ones. Because current technologies are generally accepted as good, future technological developments are not considered circumspectly, resulting in what seems to be a blind acceptance of technological development.
Optimism
See also: Extropianism
Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for society and the human condition. In these ideologies, technological development is morally good. Some critics see these ideologies as examples of scientism and techno-utopianism, and they fear the notions of human enhancement and technological singularity that these ideologies support. Some have described Karl Marx as a techno-optimist.[35]
Pessimism
See also: Luddite, Neo-Luddism, Anarcho-primitivism, and Bioconservatism
On the somewhat pessimistic side are certain philosophers, such as Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed a priori. They suggest that the result of such a society is to become ever more technological at the cost of freedom and psychological health (and probably physical health in general, as pollution from technological products is dispersed).
Many, such as the Luddites and the prominent philosopher Martin Heidegger, hold serious reservations about technology, though not the view that it is a priori flawed. Heidegger presents such a view in "The Question Concerning Technology": "Thus we shall never experience our relationship to the essence of technology so long as we merely conceive and push forward the technological, put up with it, or evade it. Everywhere we remain unfree and chained to technology, whether we passionately affirm or deny it."[36]
Some of the most poignant criticisms of technology are found in what are now considered dystopian literary classics, for example Aldous Huxley's Brave New World and other writings, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. In Goethe's Faust, Faust's selling of his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology.
An overtly anti-technological treatise is Industrial Society and Its Future, written by Theodore Kaczynski (also known as the Unabomber) and printed in several major newspapers (and later in books) as part of an effort to end his bombing campaign against the techno-industrial infrastructure.
Appropriate technology
See also: Technocriticism and Technorealism
The notion of appropriate technology was developed in the 20th century (e.g., in the work of Jacques Ellul) to describe situations where it was not desirable to use very new technologies, or technologies that required access to some centralized infrastructure or to parts and skills imported from elsewhere. The eco-village movement emerged in part due to this concern.
Other species
The use of basic technology is also a feature of species other than humans. These include primates such as chimpanzees, some dolphin communities,[37][38] and crows.[39][40] An adult gorilla, for example, has been observed using a branch as a walking stick to gauge the water's depth.
The ability to make and use tools was once considered a defining characteristic of the genus Homo.[41] However, the discovery of tool construction among chimpanzees and related primates has discredited the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees using tools for foraging; the tools used include leaf sponges, termite-fishing probes, pestles, and levers.[42] West African chimpanzees also use stone hammers and anvils for cracking nuts.[43]
See also
Main article: List of basic technology topics
List of emerging technologies · Bernard Stiegler · Golden hammer · Critique of technology · Game-changing technology · High technology · History of science and technology · Innovation · Internet · Knowledge economy · Lewis Mumford · Luddite · Technology assessment · Timeline of invention · Technological convergence · Technology tree · List of "ologies" · Science and technology · Technological superpowers
Theories and concepts in technology
Main article: Theories of technology
Appropriate technology · Diffusion of innovations · Paradigm · Philosophy of technology · Posthumanism · Precautionary principle · Strategy of technology · Techno-progressivism · Technocriticism · Technological evolution · Technological determinism · Technological nationalism · Technological singularity · Technological society · Technorealism · Technological revival · Transhumanism · Technology management
Economics of technology
Technocapitalism · Technological diffusion · Technology acceptance model · Technology lifecycle · Technology transfer
Notes
1. "Definition of technology". Merriam-Webster. Retrieved on 2007-02-16.
2. Franklin, Ursula. "Real World of Technology". House of Anansi Press. Retrieved on 2007-02-13.
3. "Technology news". BBC News. Retrieved on 2006-02-17.
4. Stiegler, Bernard (1998). Technics and Time, 1: The Fault of Epimetheus. Stanford University Press, 17, 82. ISBN 0-8047-3041-3.
5. "Industry, Technology and the Global Marketplace: International Patenting Trends in Two New Technology Areas". Science and Engineering Indicators 2002. National Science Foundation. Retrieved on 2007-05-07.
6. Borgmann, Albert (2006). "Technology as a Cultural Force: For Alena and Griffin" (fee required). The Canadian Journal of Sociology 31 (3): 351–360. doi:10.1353/cjs.2006.0050. Retrieved on 2007-02-16.
7. Macek, Jakub. "Defining Cyberculture". Retrieved on 2007-05-25.
8. "Science". Dictionary.com. Retrieved on 2007-02-17.
9. "Intute: Science, Engineering and Technology". Intute. Retrieved on 2007-02-17.
10. "Mother of man - 3.2 million years ago". BBC. Retrieved on 2008-05-17.
11. "Human Evolution". History Channel. Retrieved on 2008-05-17.
12. Wade, Nicholas (2003-07-15). "Early Voices: The Leap to Language". The New York Times. Retrieved on 2008-05-17.
13. "Human Ancestors Hall: Homo sapiens". Smithsonian Institution. Retrieved on 2007-12-08.
14. "Ancient 'tool factory' uncovered". BBC News (1999-05-06). Retrieved on 2007-02-18.
15. Heinzelin, Jean de (April 1999). "Environment and Behavior of 2.5-Million-Year-Old Bouri Hominids". Science 284 (5414): 625–629. doi:10.1126/science.284.5414.625. PMID 10213682.
16. Burke, Ariane. "Archaeology". Encyclopedia Americana. Retrieved on 2008-05-17.
17. Plummer, Thomas (2004). "Flaked Stones and Old Bones: Biological and Cultural Evolution at the Dawn of Technology". Yearbook of Physical Anthropology 47.
18. Haviland, William A. (2004). Cultural Anthropology: The Human Challenge. The Thomson Corporation, 77. ISBN 0534624871.
19. Crump, Thomas (2001). A Brief History of Science. Constable & Robinson, 9. ISBN 1-84119-235-X.
20. "Fossil Hominid Sites of Sterkfontein, Swartkrans, Kromdraai, and Environs". UNESCO. Retrieved on 2007-03-10.
21. "History of Stone Age Man". History World. Retrieved on 2007-02-13.
22. James, Steven R. (February 1989). "Hominid Use of Fire in the Lower and Middle Pleistocene" (fee required). Current Anthropology 30 (1): 1–26. doi:10.1086/203705.
23. Stahl, Ann B. (1984). "Hominid dietary selection before fire" (fee required). Current Anthropology 25: 151–168. doi:10.1086/203106.
24. O'Neil, Dennis. "Evolution of Modern Humans: Archaic Homo sapiens Culture". Palomar College. Retrieved on 2007-03-31.
25. Villa, Paola (1983). Terra Amata and the Middle Pleistocene archaeological record of southern France. Berkeley: University of California Press, 303. ISBN 0-520-09662-2.
26. Cordaux, Richard; Stoneking, Mark (2003). "South Asia, the Andamanese and the genetic evidence for an 'early' human dispersal out of Africa". American Journal of Human Genetics 72: 1586. doi:10.1086/375407.
27. "The First Baby Boom: Skeletal Evidence Shows Abrupt Worldwide Increase In Birth Rate During Neolithic Period". Science Daily (2006-01-04). Retrieved on 2008-05-17.
28. Sussman, Robert W.; Hall, Roberta L. (April 1972). "Child Transport, Family Size, and Increase in Human Population During the Neolithic". Current Anthropology 13 (2): 258–267. University of Chicago Press. doi:10.1086/201274. Retrieved on 2008-05-17.
29. Ferraro, Gary P. (2006). Cultural Anthropology: An Applied Perspective. The Thomson Corporation. Retrieved on 2008-05-17.
30. Patterson, Gordon M. (1992). The Essentials of Ancient History. Research & Education Association. Retrieved on 2008-05-17.
31. Cramb, Alan W. "A Short History of Metals". Carnegie Mellon University. Retrieved on 2007-01-08.
32. Chisholm, Hugh (1910). Encyclopædia Britannica. Retrieved on 2008-05-17.
33. "Slovenian Marsh Yields World's Oldest Wheel". Ameriška Domovina (2003-03-27). Retrieved on 2007-02-13.
34. Monsma, Stephen V. (1986). Responsible Technology. Grand Rapids: W.B. Eerdmans Pub. Co.
35. Hughes, James (2002). "Democratic Transhumanism 2.0". Retrieved on 2007-01-26.
36. Lovitt, William (1977). "The Question Concerning Technology", The Question Concerning Technology and Other Essays. Harper Torchbooks, 3–35. Retrieved on 2007-11-21.
37. Sagan, Carl; Druyan, Ann; Leakey, Richard. "Chimpanzee Tool Use". Retrieved on 2007-02-13.
38. Rincon, Paul (2005-06-07). "Sponging dolphins learn from mum". BBC News. Retrieved on 2007-02-13.
39. Schmid, Randolph E. (2007-10-04). "Crows use tools to find food". MSNBC. Retrieved on 2008-05-17.
40. Rutz, C.; Bluff, L.A.; Weir, A.A.S.; Kacelnik, A. (2007-10-04). "Video cameras on wild birds". Science.
41. Oakley, K. P. (1976). Man the Tool-Maker. University of Chicago Press. ISBN 978-0226612706.
42. McGrew, W. C. (1992). Chimpanzee Material Culture. ISBN 978-0521423717.
43. Boesch, Christophe; Boesch, Hedwige (1984). "Mental map in wild chimpanzees: An analysis of hammer transports for nut cracking" (fee required). Primates 25: 160–170. doi:10.1007/BF02382388.
References
Ambrose, Stanley H. (2001-03-02). "Paleolithic Technology and Human Evolution". Science. Retrieved on 2007-03-10.
Further reading
Kremer, Michael (1993). "Population Growth and Technological Change: One Million B.C. to 1990". The Quarterly Journal of Economics 108 (3): 681–716.