JOURNEY TO DISCOVERY MADE BY – PUSHKAR KUMAR MAHTO

Inventions and discoveries




DISCOVERIES IN TRANSPORTATION

DISCOVERY OF WHEELS

Evidence of wheeled vehicles appears from the second half of the 4th millennium BC, near-simultaneously in Mesopotamia (Sumerian civilization), the Northern Caucasus (Maykop culture) and Central Europe, so that the question of which culture originally invented the wheeled vehicle is still unsolved.

The earliest well-dated depiction of a wheeled vehicle (here a wagon: four wheels, two axles) is on the Bronocice pot, a c. 3500–3350 BC clay pot excavated in a Funnelbeaker culture settlement in southern Poland.

The oldest securely dated real wheel-axle combination, that from Stare Gmajne near Ljubljana in Slovenia (the Ljubljana Marshes Wooden Wheel), is now dated within 2σ limits to 3340–3030 cal BC, the axle to 3360–3045 cal BC.

In China, the wheel was certainly present with the adoption of the chariot in c. 1200 BC, although Barbieri-Low argues for earlier Chinese wheeled vehicles, c. 2000 BC.

Although they did not develop the wheel proper, the Olmec and certain other Western Hemisphere cultures seem to have approached it, as wheel-like worked stones have been found on objects identified as children's toys dating to about 1500 BC. It is thought that the primary obstacle to large-scale development of the wheel in the Western Hemisphere was the absence of domesticated large animals that could be used to pull wheeled carriages. The closest relative of cattle present in the Americas in pre-Columbian times, the American bison, is difficult to domesticate and was never domesticated by Native Americans; several horse species existed until about 12,000 years ago, but ultimately went extinct.

DISCOVERY OF METRO

Houston's METRORail was built after an approximately 20-year battle, starting in 1983 when voters rejected a rail plan by referendum. A voter referendum in 1988 approved a 20-mile (32 km) light rail plan; however, Bob Lanier was elected mayor in 1992 and stopped the plan. In 1991, U.S. Rep. Tom DeLay removed $65 million in federal funding for the rail line. Houston then drew up a rail plan with entirely local funding. In 2001, several groups sued to stop construction, claiming that the METRO organization was a "private business" and subject to Houston City Charter provisions regulating business use of its streets; they obtained two temporary injunctions in January 2001, which were reversed by an appeals court on March 9, 2001.

DISCOVERY OF PIPELINE

Pipelines exist for the transport of crude and refined petroleum, fuels - such as oil, natural gas and biofuels - and other fluids including sewage, slurry, water, and beer. Pipelines are useful for transporting water for drinking or irrigation over long distances when it needs to move over hills, or where canals or channels are poor choices due to considerations of evaporation, pollution, or environmental impact. Pneumatic tubes using compressed air can be used to transport solid capsules.

DISCOVERIES IN AGRICULTURE

DISCOVERY OF PESTICIDES

Since before 2000 BC, humans have utilized pesticides to protect their crops. The first known pesticide was elemental sulfur dusting, used in ancient Sumer in Mesopotamia about 4,500 years ago. The Rig Veda, which is about 4,000 years old, mentions the use of poisonous plants for pest control. By the 15th century, toxic chemicals such as arsenic, mercury, and lead were being applied to crops to kill pests. In the 17th century, nicotine sulfate was extracted from tobacco leaves for use as an insecticide. The 19th century saw the introduction of two more natural pesticides: pyrethrum, derived from chrysanthemums, and rotenone, derived from the roots of tropical vegetables.

Until the 1950s, arsenic-based pesticides were dominant. Paul Müller discovered that DDT was a very effective insecticide. Organochlorines such as DDT were dominant, but they were replaced in the U.S. by organophosphates and carbamates by 1975. Since then, pyrethrin compounds have become the dominant insecticide. Herbicides became common in the 1960s, led by "triazine and other nitrogen-based compounds, carboxylic acids such as 2,4-dichlorophenoxyacetic acid, and glyphosate".

WEEDICIDE

Prior to the widespread use of chemical herbicides, cultural controls, such as altering soil pH, salinity, or fertility levels, were used to control weeds. Mechanical control (including tillage) was also (and still is) used to control weeds. Although research into chemical herbicides began in the early 20th century, the first major breakthrough was the result of research conducted in both the UK and the US during the Second World War into the potential use of agents as biological weapons. The first modern herbicide, 2,4-D, was first discovered and synthesized by W. G. Templeman at Imperial Chemical Industries. In 1940, he showed that "growth substances applied appropriately would kill certain broad-leaved weeds in cereals without harming the crops." By 1941, his team had succeeded in synthesizing the chemical. In the same year, Pokorny in the US achieved this as well. Independently, a team under Juda Hirsch Quastel, working at the Rothamsted Experimental Station, made the same discovery.

HYV SEEDS

In 1961, India was on the brink of mass famine. Norman Borlaug was invited to India by the adviser to the Indian minister of agriculture, C. Subramaniam. Despite bureaucratic hurdles imposed by India's grain monopolies, the Ford Foundation and the Indian government collaborated to import wheat seed from the International Maize and Wheat Improvement Center (CIMMYT). Punjab was selected by the Indian government as the first site to try the new crops because of its reliable water supply and its history of agricultural success. India began its own Green Revolution program of plant breeding, irrigation development, and financing of agrochemicals. India soon adopted IR8, a semi-dwarf rice variety developed by the International Rice Research Institute (IRRI) that could produce more grains of rice per plant when grown with certain fertilizers and irrigation.

In 1968, Indian agronomist S. K. De Datta published his findings that IR8 rice yielded about 5 tons per hectare with no fertilizer, and almost 10 tons per hectare under optimal conditions. This was 10 times the yield of traditional rice. IR8 was a success throughout Asia and was dubbed the "Miracle Rice"; it was also developed into the semi-dwarf IR36. The steep rise in crop yields in the U.S. began in the 1940s, with percentage growth fastest in the early rapid-growth stage; in developing countries, maize yields are still rising rapidly. In the 1960s, rice yields in India were about two tons per hectare; by the mid-1990s, they had risen to six tons per hectare. In the 1970s, rice cost about $550 a ton; in 2001, it cost under $200 a ton. India became one of the world's most successful rice producers and is now a major rice exporter, shipping nearly 4.5 million tons in 2006.
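The yield figures above imply a baseline: if almost 10 tons per hectare is 10 times the traditional yield, traditional varieties produced roughly 1 ton per hectare. A quick arithmetic sketch using only the values quoted in the text:

```python
# Yield figures quoted above, in tons per hectare.
ir8_optimal = 10.0        # IR8 under optimal fertilizer and irrigation
ir8_no_fertilizer = 5.0   # IR8 with no fertilizer

# "10 times the yield of traditional rice" implies the baseline:
traditional = ir8_optimal / 10
print(traditional)                      # 1.0 t/ha implied for traditional rice
print(ir8_no_fertilizer / traditional)  # 5.0 — even unfertilized IR8 is 5x
```

Even without fertilizer, the new variety was five times the implied traditional yield, which is why its adoption spread so quickly.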

COMBINE HARVESTERS

Scottish inventor Patrick Bell invented the reaper in 1826. The combine was invented in the United States by Hiram Moore in 1834, and early versions were pulled by teams of horses, mules, or oxen. In 1835, Moore built a full-scale version, and by 1839 over 50 acres of crops had been harvested with it. By 1860, combine harvesters with a cutting width of several meters were used on American farms. In 1882, the Australian Hugh Victor McKay had a similar idea, and in 1885 he developed the first commercial combine harvester, the Sunshine Harvester. Combines, some of them quite large, were drawn by mule or horse teams and used a bull wheel to provide power. Later, steam power was used, and George Stockton Berry integrated the combine with a steam engine that used straw to heat the boiler.

DISCOVERIES IN IT SECTOR

Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern electronic digital stored-program computer was the Manchester Small-Scale Experimental Machine (SSEM), which ran its first program on 21 June 1948.


DISCOVERY OF COMPUTER

The era of modern computing is said to have started in 1837, when Charles Babbage designed the first programmable computer to perform mathematical calculations. A later digital computer, ENIAC, appeared in 1946; it was very large and used vacuum tubes. Around the 1960s, vacuum tubes were replaced with semiconductor transistors, and computers built with them were less expensive, consumed less electricity, and ran faster. In the 1970s, integrated circuits appeared, leading to Intel's microprocessor; computers built with ICs were highly reliable, fast, and inexpensive. Finally, in the 1980s, home personal computers (PCs) emerged. Many other inventions, such as calculators performing mathematical operations and punch-card machines for tabulating data, led toward the development of the computer. Babbage's Analytical Engine of 1837 was a fusion of programmability and calculation, giving rise to the design of a computer usable for multiple purposes.

A machine that could process 15-digit numbers and compute fourth-order differences was built by Georg Scheutz and his son Edvard. The US Census Bureau was one of the first organizations to use punch cards in computing, tabulating the 1890 census with a machine developed by Herman Hollerith. Hollerith's firm merged with others in 1911, eventually becoming the International Business Machines Corporation (IBM), which went on to create many different computers. Babbage's interest in computing machines was inspired by the desktop calculators developed by Thomas de Colmar. In 1812, he recognized that many of the long calculations required to create mathematical tables were merely sequences of predictable actions that were continually repeated. From this observation, he concluded that it should be possible to perform these calculations automatically. He then began to design an automatic mechanical calculating machine that he termed the Difference Engine. In 1823, he began building it with the financial support of the British government. It was designed to be fully automatic, commanded by a fixed instruction program, and to print the resulting tables. Even though it had limited flexibility and applicability, the Difference Engine was a huge advance for its day. Babbage later conceived the more general Analytical Engine, and also designed an improved machine, Difference Engine No. 2, that superseded its predecessor.

DISCOVERY OF INTERNET

The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, Great Britain, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol). The first message was sent over the ARPANET from computer science professor Leonard Kleinrock's laboratory at the University of California, Los Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).

Packet switching networks such as the ARPANET, the Mark I network at NPL in the UK, CYCLADES, the Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.

Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States for research and education organizations. Commercial Internet service providers (ISPs) began to emerge in the late 1980s. The ARPANET was decommissioned in 1990. Private connections to the Internet by commercial entities became widespread quickly, and the NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic. In the 1980s, the work of British scientist Tim Berners-Lee on the World Wide Web linked hypertext documents into a working information system, marking the beginning of the modern Internet.
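The TCP/IP suite mentioned above still underpins everyday network programming. A minimal sketch of a TCP exchange over the loopback interface, using only the Python standard library (the echo service and message are hypothetical, purely to illustrate the client-server model TCP/IP standardized):

```python
import socket
import threading

# A tiny echo server: accept one connection, echo the data back, close.
def run_server(listener):
    conn, _ = listener.accept()
    data = conn.recv(1024)
    conn.sendall(b"echo:" + data)
    conn.close()

# Bind to an ephemeral port on localhost and serve in a background thread.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=run_server, args=(server,))
t.start()

# The client side: open a TCP connection, send a message, read until EOF.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = b""
while True:
    chunk = client.recv(1024)
    if not chunk:
        break
    reply += chunk
client.close()
t.join()
server.close()

print(reply)  # prints b'echo:hello'
```

The same connect/send/receive pattern, scaled up, is what every web browser and mail client has done since TCP/IP became the standard.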

Since the mid-1990s, the Internet has had a revolutionary impact on culture and commerce, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as NSF's very high speed Backbone Network Service (vBNS), Internet2, and National LambdaRail. Increasing amounts of data are transmitted at higher and higher speeds over fiber-optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet's takeover of the global communication landscape was almost instant in historical terms: it carried only 1% of the information flowing through two-way telecommunications networks in 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking.

In 1890, punch cards were first successfully used with computing machines by Herman Hollerith and James Powers, who worked for the US Census Bureau. They developed devices that could read the data punched into the cards automatically, without human assistance. These devices brought several improvements: work flow improved, reading errors decreased dramatically, and, above all, stacks of punched cards could be used as an easily accessible memory store of practically unlimited size. These benefits were proved in practice by commercial firms such as IBM, Remington, and Burroughs.

The digital computer arose from the wartime need for equipment that could complete calculations at high speed. In 1942, J. Presper Eckert and his associates began developing what became ENIAC (Electrical Numerical Integrator And Calculator), used by the military to compute weapon trajectory tables. This digital computer was built using some 18,000 vacuum tubes; it could accept numerical words of 10 decimal digits and multiply two numbers at a rate of around 300 per second. The machine physically occupied about 1,800 square feet and consumed 180,000 watts of electricity. It was roughly 1,000 times faster than earlier machines, as it was driven by programs made up of instructions that controlled its operation and the flow of data. With these features, it was regarded on its completion in 1946 as the first high-speed electronic digital computer.

DISCOVERIES IN MEDICAL SCIENCE

TREATMENT OF CANCER

Clinical trials, which are research studies, test new treatments in people with cancer. The goal of this research is to find better ways to treat cancer and help cancer patients. Clinical trials test many types of treatment, such as new drugs, new approaches to surgery or radiation therapy, new combinations of treatments, or new methods such as gene therapy. A clinical trial is one of the final stages of a long and careful cancer research process. The search for new treatments begins in the laboratory, where scientists first develop and test new ideas. If an approach seems promising, the next step may be testing a treatment in animals to see how it affects cancer in a living being and whether it has harmful effects.

Of course, treatments that work well in the lab or in animals do not always work well in people. Studies are done with cancer patients to find out whether promising treatments are safe and effective. Patients who take part may be helped personally by the treatment they receive. They get up-to-date care from cancer experts, and they receive either a new treatment being tested or the best available standard treatment for their cancer. At the same time, new treatments may carry unknown risks, but if a new treatment proves effective, or more effective than the standard treatment, study patients who receive it may be among the first to benefit. There is no guarantee that a new treatment being tested, or a standard treatment, will produce good results. In children with cancer, a survey of trials found that those enrolled were, on average, no more likely to do better or worse than those on standard treatment, confirming that the success or failure of an experimental treatment cannot be predicted.

TREATMENT OF AIDS

AIDS was first clinically observed in 1981 in the United States. The initial cases were a cluster of injecting drug users and homosexual men with no known cause of impaired immunity who showed symptoms of Pneumocystis carinii pneumonia (PCP), a rare opportunistic infection that was known to occur in people with very compromised immune systems. Soon thereafter, an unexpected number of homosexual men developed a previously rare skin cancer called Kaposi's sarcoma (KS). Many more cases of PCP and KS emerged, alerting the U.S. Centers for Disease Control and Prevention (CDC), and a CDC task force was formed to monitor the outbreak.

In the early days, the CDC did not have an official name for the disease, often referring to it by way of the diseases associated with it, for example lymphadenopathy, the disease after which the discoverers of HIV originally named the virus. They also used "Kaposi's sarcoma and opportunistic infections", the name under which a task force had been set up in 1981. At one point, the CDC coined the phrase "the 4H disease", since the syndrome seemed to affect Haitians, homosexuals, hemophiliacs, and heroin users. In the general press, the term "GRID", which stood for gay-related immune deficiency, had been coined. However, after determining that AIDS was not isolated to the gay community, it was realized that the term GRID was misleading, and the term AIDS was introduced at a meeting in July 1982. By September 1982 the CDC had started referring to the disease as AIDS.

In 1983, two separate research groups led by Robert Gallo and Luc Montagnier independently declared that a novel retrovirus may have been infecting people with AIDS, and published their findings in the same issue of the journal Science. Gallo claimed that a virus his group had isolated from a person with AIDS was strikingly similar in shape to other human T-lymphotropic viruses (HTLVs) his group had been the first to isolate. Gallo's group called their newly isolated virus HTLV-III. At the same time, Montagnier's group isolated a virus from a person presenting with swelling of the lymph nodes of the neck and physical weakness, two characteristic symptoms of AIDS. Contradicting the report from Gallo's group, Montagnier and his colleagues showed that core proteins of this virus were immunologically different from those of HTLV-I. Montagnier's group named their isolated virus lymphadenopathy-associated virus (LAV). As these two viruses turned out to be the same, in 1986 LAV and HTLV-III were renamed HIV.

X-RAY

German physicist Wilhelm Röntgen is usually credited as the discoverer of X-rays in 1895, because he was the first to systematically study them, though he was not the first to have observed their effects. He is also the one who gave them the name "X-rays", though many others referred to them as "Röntgen rays" (and to the associated X-ray radiograms as "Roentgenograms") for several decades after their discovery, and even to this day in some languages, including Röntgen's native German. X-rays were found emanating from Crookes tubes, experimental discharge tubes invented around 1875 by scientists investigating cathode rays, that is, energetic electron beams, which were first created in the tubes. Crookes tubes created free electrons by ionization of the residual air in the tube by a high DC voltage of anywhere between a few kilovolts and 100 kV.

This voltage accelerated the electrons coming from the cathode to a high enough velocity that they created X-rays when they struck the anode or the glass wall of the tube. Many of the early Crookes tubes undoubtedly radiated X-rays, because early researchers noticed effects that were attributable to them. Wilhelm Röntgen was the first to systematically study them, in 1895.

ULTRASOUND

Acoustics, the science of sound, starts as far back as Pythagoras in the 6th century BC, who wrote on the mathematical properties of stringed instruments. Sir Francis Galton constructed a whistle producing ultrasound in 1893. The first technological application of ultrasound was an attempt to detect submarines by Paul Langevin in 1917. The piezoelectric effect, discovered by Jacques and Pierre Curie in 1880, proved useful in transducers to generate and detect ultrasonic waves in air and water. Echolocation in bats was discovered by Lazzaro Spallanzani in 1794, when he demonstrated that bats hunted and navigated by inaudible sound, not vision.
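Both bat echolocation and Langevin's submarine detection rest on one formula: distance equals the speed of sound times the round-trip time of the echo, divided by two. A minimal sketch (the speed and timing values below are illustrative assumptions, not figures from the text):

```python
# Pulse-echo ranging: distance = speed * round-trip time / 2.
speed_of_sound_water = 1500.0  # m/s, rough figure for sea water (assumed)
round_trip_time = 0.2          # seconds from emitting a ping to hearing the echo (assumed)

# Divide by 2 because the sound travels out to the target and back.
distance_m = speed_of_sound_water * round_trip_time / 2
print(distance_m)  # prints 150.0 — the reflector is about 150 m away
```

Modern sonar and medical ultrasound scanners apply exactly this calculation, just with many pulses per second and precise transducer timing.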

DAILY USE THINGS

NAIL CUTTER

The inventor of the nail clipper is not exactly known, but the first United States patent for an improvement in a fingernail clipper (implying such a device already existed) seems to be from 1875, by Valentine Fogerty. Other patents for improvements in fingernail clippers followed in 1876 (William C. Edge) and 1878 (John H. Holman). Filings for complete fingernail clippers (not merely improvements) include, in 1881, Eugene Heim and Celestin Martz; in 1885, George H. Coates (for a fingernail cutter); and, in 1905, Chapel S. Carter (son of a Connecticut Baptist church deacon), who patented a fingernail clipper with a later patent in 1922. Around 1913, Carter was secretary of the H. C. Cook Company of Ansonia, Connecticut, which was incorporated in 1903 as the H. C. Cook Machine Company by Henry C. Cook, Lewis I. Cook, and Chapel S. Carter. Around 1928, Carter was president of the company when, he claimed, the "Gem"-brand fingernail clipper had made its first appearance about 1896.

SCISSORS

It is most likely that scissors were invented around 1500 BC in ancient Egypt. The earliest known scissors appeared in Mesopotamia 3,000 to 4,000 years ago. These were of the 'spring scissor' type, comprising two bronze blades connected at the handles by a thin, flexible strip of curved bronze which served to hold the blades in alignment, to allow them to be squeezed together, and to pull them apart when released.

Spring scissors continued to be used in Europe until the 16th century. However, pivoted scissors of bronze or iron, in which the blades were pivoted at a point between the tips and the handles, the direct ancestor of modern scissors, were invented by the Romans around AD 100.

They entered common use not only in ancient Rome, but also in China, Japan, and Korea, and the idea is still used in almost all modern scissors.

It is often reported that Leonardo da Vinci invented scissors. But the archaeological and historical evidence seems to contradict that claim. Usually no evidence, such as a drawing of scissors by da Vinci, is offered to support the claim, which is an example of an "unreal fact", similar to an urban myth.

HOME APPLIANCES

While many appliances have existed for centuries, the self-contained electric or gas powered appliance is a uniquely American innovation that emerged in the twentieth century. The development of these appliances is tied to the disappearance of full-time domestic servants and to the desire to reduce time-consuming household activities in pursuit of more recreational time. In the early 1900s, electric and gas appliances included clothes washers, water heaters, refrigerators, and sewing machines. Post-World War II, the domestic use of dishwashers and clothes dryers was part of a shift toward convenience. As discretionary spending increased, it was reflected in a rise in miscellaneous home appliances.

In America during the 1980s, the industry shipped $1.5 billion worth of goods each year and employed over 14,000 workers, with revenues doubling between 1982 and 1990 to $3.3 billion. Throughout this period companies merged and acquired one another to reduce research and production costs and eliminate competitors, prompting anti-trust scrutiny. The National Appliance Energy Conservation Act of 1987 set energy standards that required manufacturers to reduce the energy consumption of appliances by 25% every five years.
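The 25%-per-five-years requirement compounds: each cycle cuts from an already-reduced level, so consumption never reaches zero. A sketch of the compounding arithmetic against an assumed 100-unit baseline (illustrative only, not data from the act):

```python
# Each five-year cycle keeps 75% of the previous cycle's consumption.
consumption = 100.0          # assumed baseline, arbitrary units
for cycle in range(3):       # three five-year cycles = 15 years
    consumption *= 0.75
print(consumption)           # prints 42.1875 — under half the baseline after 15 years
```

The geometric decay explains why such standards deliver big early savings and diminishing absolute savings later.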

The 1987 act's energy-efficiency standards were estimated to save each household about $2,000, or a total of roughly $200 billion nationwide.

GADGETS

The origins of the word "gadget" trace back to the 19th century. According to the Oxford English Dictionary, there is anecdotal (not necessarily true) evidence for the use of "gadget" as a placeholder name for a technical item whose precise name one can't remember since the 1850s, with Robert Brown's 1886 book Spunyarn and Spindrift, A sailor boy's log of a voyage out and home in a China tea-clipper containing the earliest known usage in print. The etymology of the word is disputed.

A widely circulated story holds that the word gadget was "invented" when Gaget, Gauthier & Cie, the company behind the repoussé construction of the Statue of Liberty (1886), made a small-scale version of the monument and named it after their firm; however, this contradicts the evidence that the word was already used before in nautical circles, and the fact that it did not become popular, at least in the USA, until after World War I.

Other sources derive the word from the French gâchette, which has been applied to various pieces of a firing mechanism, or the French gage, a small tool or accessory.

The October 1918 issue of Notes and Queries contains a multi-article entry on the word "gadget" (12 S. iv. 187). H. Tapley-Soper of The City Library, Exeter, writes:

A discussion arose at the Plymouth meeting of the Devonshire Association in 1916 when it was suggested that this word should be recorded in the list of local verbal provincialisms. Several members dissented from its inclusion on the ground that it is in common use throughout the country; and a naval officer who was present said that it has for years been a popular expression in the service for a tool or implement, the exact name of which is unknown or has for the moment been forgotten. I have also frequently heard it applied by motor-cycle friends to the collection of fitments to be seen on motor cycles.

DISCOVERY OF ELECTRICITY

Long before any knowledge of electricity existed, people were aware of shocks from electric fish. Ancient Egyptian texts dating from 2750 BC referred to these fish as the "Thunderer of the Nile" and described them as the "protectors" of all other fish. Electric fish were again reported millennia later by ancient Greek, Roman, and Arabic naturalists and physicians. Several ancient writers, such as Pliny the Elder and Scribonius Largus, attested to the numbing effect of electric shocks delivered by catfish and torpedo rays, and knew that such shocks could travel along conducting objects. Patients suffering from ailments such as gout or headache were directed to touch electric fish in the hope that the powerful jolt might cure them. Possibly the earliest and nearest approach to the discovery of the identity of lightning and electricity from any other source is to be attributed to the Arabs, who before the 15th century applied the Arabic word for lightning to the electric ray.

Ancient cultures around the Mediterranean knew that certain objects, such as rods of amber, could be rubbed with cat's fur to attract light objects like feathers. Thales of Miletus made a series of observations on static electricity around 600 BC, from which he believed that friction rendered amber magnetic, in contrast to minerals such as magnetite, which needed no rubbing. Thales was incorrect in believing the attraction was due to a magnetic effect, but later science would prove a link between magnetism and electricity. According to a controversial theory, the Parthians may have had knowledge of electroplating, based on the 1936 discovery of the Baghdad Battery, which resembles a galvanic cell, though it is uncertain whether the artifact was electrical in nature.

Electricity would remain little more than an intellectual curiosity for millennia until 1600, when the English scientist William Gilbert made a careful study of electricity and magnetism, distinguishing the lodestone effect from static electricity produced by rubbing amber. He coined the New Latin word electricus ("of amber" or "like amber", from ἤλεκτρον, elektron, the Greek word for "amber") to refer to the property of attracting small objects after being rubbed. This association gave rise to the English words "electric" and "electricity", which made their first appearance in print in Thomas Browne's Pseudodoxia Epidemica of 1646.

FIRE

As early hominids moved from the tropics into colder regions, they needed to adjust to new, often harsh conditions. Perhaps most important to their ability to adapt was the use of fire. The systematic use of fire made it possible to provide a source of both light and heat within the caves or structures in which they lived. The development of tools and the use of fire, the two most important technological innovations of the Paleolithic period, remind us how crucial the ability to adapt was to human survival. The most recent Ice Age began about 100,000 B.C. and reached its coldest period between 20,000 and 10,000 B.C. Thick sheets of ice covered large parts of Europe, Asia, and North America.

Early hominids experienced natural fires caused by lightning and learned their benefits. Homo erectus was among the first to learn the use of fire. Archaeologists have discovered piled remains of ashes in caves proving that Paleolithic people used fire systematically as long ago as 500,000 years, as at the Homo erectus site at Zhoukoudian (Choukoutien) in northern China, where hearths, ashes, charcoal, and charred bones about 400,000 years old have been found. That cave faced northeast, so it would have been a dark place without fire. Fire gave warmth and undoubtedly fostered a sense of community for the groups of people gathered around it. Fire also protected early humans by scaring wild animals away.

They also used fire to cook food, having discovered that cooked food is easier to chew and tastes better than raw food. Scholars believe that techniques for starting fires were discovered independently throughout the world. Archaeologists lack direct evidence of how early peoples started fires, but they have been able to examine the methods used by more recent peoples. On this basis, archaeologists assume that the earliest methods for starting fires were probably based on friction, such as rubbing two pieces of wood together; dry grass and leaves could be added as the wood began to smoke. Eventually, Paleolithic peoples devised sturdy, drill-like wooden devices to start fires. Other early humans discovered that a stone called iron pyrite, when struck against a hard rock, gave off a spark that could also start a fire. Paleolithic peoples used their technological innovations, toolmaking and the use of fire, to change their physical environment, and by working together they found a way to survive. In this way, Paleolithic peoples played a crucial role in human history.

THE END