Wednesday, December 25, 2019

Harriet Tubman: A Biography - 1298 Words

James A. McGowan and William C. Kashatus, the authors of Harriet Tubman: A Biography, focus on telling the remarkable story of Harriet Tubman. The biography tells the story of her early years as young Araminta Ross, how she escaped slavery, her work as a conductor on the Underground Railroad, and her roles in the Civil War. I feel the authors' purpose in writing this book was to educate and inform readers about the times Harriet Tubman lived in and what she had to endure, and also to give an understanding of why Harriet Tubman was so legendary in the time she lived and why she still is today.

Harriet Tubman: A Biography is centered on Harriet Tubman, who was born into slavery in Maryland. She was the daughter of Benjamin Ross, her father… One year after the marriage, Brodess planned to sell Harriet, but he died, leaving his wife Eliza Brodess in debt. To pay off the debt, Eliza planned to sell Harriet, but this time Harriet decided to take matters into her own hands. On September 17, 1849, Harriet and her two brothers, Ben and Henry, escaped from slavery. But the family constantly argued over directions, and because of their lack of knowledge of the route the brothers decided to return to the plantation, which forced Harriet to go with them. She finally ran off on her own, realizing that it would be too dangerous to reveal her plans to her family. She traveled at night and hid during the day. She got food from free dock workers, who willingly provided her food and shelter and guided her through hidden tunnels and pathways. She was also helped by Hannah Leverton, a white woman. Leverton drove Harriet to the Delaware border and gave her a piece of paper with the information of Thomas Garrett, a Quaker abolitionist whose Underground Railroad station was the last stop in Delaware before the Pennsylvania boundary dividing the free and slave states. This led to Harriet crossing the Mason-Dixon Line, finally gaining her freedom. Once in Philadelphia, Harriet began making plans to…
â€Å"In the 12 years from her escape in 1849 to the beginning of the Civil War in 1861, Harriet Tubman and the Underground Railroad became the most dominant force of abolitionism† (â€Å"Women in History†, 2012). She was known to many throughout the country as the â€Å"General† because of her daring

Tuesday, December 17, 2019

Comparing Sheila and Lady Macbeth's Relationships with...

Comparing Sheila and Lady Macbeth

In this essay I will be comparing Sheila's and Lady Macbeth's relationships with their husbands. At the start of the play, Sheila and Gerald have known each other for some time, and they are celebrating their engagement, which Sheila is really happy about; in the middle she starts to gain some power, and by the end she is fully in power and tells her family what to do. Gerald comes from a rich, powerful, well-respected family. At the start of Macbeth, Lady Macbeth is in control of the relationship: she begins as the dominant partner, in contrast to the typical Jacobean woman, yet she kills herself at the end.

At the start of the play An Inspector Calls, J.B. Priestley presents Sheila and Gerald with a… Women of her age at that time were expected to be ruled by their father or by their husband. However, throughout the beginning of the play Macbeth, Lady Macbeth is seen to be in full charge. The fact that she planned the murder, and her level of confidence, tells us that she has no mercy on people. "We fail?" This quote is used when Macbeth asks Lady Macbeth what would happen if they fail. She uses this rhetorical question to show off her confidence as well as her cold-blooded nature. This links to how she is in an unusual type of relationship with Macbeth.

In addition, Lady Macbeth speaks as if she is certain that the murder will go ahead. But she is fully aware that it is going to be emotionally difficult to go through with, and she calls upon evil spirits to assist her murderous plans: "Come, you spirits that tend on mortal thoughts, unsex me here, and fill me from the crown to the toe top-full of direst cruelty." She does not want to have any human emotion in herself, as she knows that it could cause her to back down and not go through with her plans, so she begs to lose her inner feelings of conscience. This would be seen as strange, as women at that time would not do any such thing.

From this we can understand that at the beginning Sheila and Lady Macbeth have contradictory and contrasting relationships with their husbands. On one hand, Sheila is an immature and childish character, whereas her husband, Gerald, is the…

Monday, December 9, 2019

History of the Computer Industry in America Essay

Only once in a lifetime will a new invention come about to touch every aspect of our lives. Such a device that changes the way we work, live, and play is a special one, indeed. A machine that has done all this and more now exists in nearly every business in the U.S. and one out of every two households (Hall, 156). This incredible invention is the computer. The electronic computer has been around for over a half-century, but its ancestors have been around for 2000 years. However, only in the last 40 years has it changed American society. From the first wooden abacus to the latest high-speed microprocessor, the computer has changed nearly every aspect of people's lives for the better.

The very earliest ancestor of the modern-day computer is the abacus, dating back almost 2000 years. It is simply a wooden rack holding parallel wires on which beads are strung. When these beads are moved along the wire according to programming rules that the user must memorize, all ordinary arithmetic operations can be performed (Soma, 14). The next innovation in computers took place in 1642, when Blaise Pascal invented the first digital calculating machine. It could only add numbers, and they had to be entered by turning dials. It was designed to help Pascal's father, who was a tax collector (Soma, 32).

In the early 1800s, a mathematics professor named Charles Babbage designed an automatic calculation machine. It was steam powered and could store up to 1000 50-digit numbers. Built into his machine were operations that included everything a modern general-purpose computer would need. It was programmed by, and stored data on, cards with holes punched in them, appropriately called punchcards. His inventions were failures for the most part because of the lack of precision machining techniques used at the time and the lack of demand for such a device (Soma, 46).

After Babbage, people began to lose interest in computers. However, between 1850 and 1900 there were great advances in mathematics and physics that began to rekindle the interest (Osborne, 45). Many of these new advances involved complex calculations and formulas that were very time consuming for human calculation. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read information on cards without human intervention (Gulliver, 82). Since the population of the U.S. was increasing so fast, the computer was an essential tool in tabulating the totals.

These advantages were noted by commercial industries and soon led to the development of improved punch-card business-machine systems by International Business Machines (IBM), Remington-Rand, Burroughs, and other corporations. By modern standards the punched-card machines were slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 digits. At the time, however, punched cards were an enormous step forward; they provided a means of input, output, and memory storage on a massive scale. For more than 50 years following their first use, punched-card machines did the bulk of the world's business computing and a good portion of the computing work in science (Chposky, 73).
By the late 1930s punched-card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic digital computer based on standard IBM electromechanical parts. Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations. Also, it had special built-in programs to handle logarithms and trigonometric functions. The Mark I was controlled from prepunched paper tape. Output was by card punch and electric typewriter. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations without human intervention (Chposky, 103).

The outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapons systems were produced which needed trajectory tables and other essential data. In 1942, John P. Eckert, John W. Mauchly, and their associates at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for Electrical Numerical Integrator And Calculator. It could multiply two numbers at the rate of 300 products per second, by finding the value of each product from a multiplication table stored in its memory. ENIAC was thus about 1,000 times faster than the previous generation of computers (Dolotta, 47).

ENIAC used 18,000 standard vacuum tubes, occupied 1800 square feet of floor space, and used about 180,000 watts of electricity. It used punched-card input and output. The ENIAC was very difficult to program because one had to essentially re-wire it to perform whatever task one wanted the computer to do. It was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally accepted as the first successful high-speed electronic digital computer and was used in many applications from 1946 to 1955 (Dolotta, 50).

Mathematician John von Neumann was very interested in the ENIAC. In 1945 he undertook a theoretical study of computation that demonstrated that a computer could have a very simple, fixed structure and yet be able to execute any kind of computation effectively by means of proper programmed control, without the need for any changes in hardware.

In 1971 Marcian E. Hoff, Jr., an engineer at the Intel Corporation, invented the microprocessor, and another stage in the development of the computer began (Shallis, 121). A new revolution in computer hardware was now well under way, involving miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration techniques. In the 1950s it was realized that scaling down the size of electronic digital computer circuits and parts would increase speed and efficiency and improve performance. However, at that time the manufacturing methods were not good enough to accomplish such a task. About 1960 photoprinting of conductive circuit boards to eliminate wiring became highly developed. Then it became possible to build resistors and capacitors into the circuitry by photographic means (Rogers, 142). In the 1970s entire assemblies, such as adders, shifting registers, and counters, became available on tiny chips of silicon. In the 1980s very large scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became increasingly common.
Many companies, some new to the computer field, introduced in the 1970s programmable minicomputers supplied with software packages. The size-reduction trend continued with the introduction of personal computers, which are programmable machines small enough and inexpensive enough to be purchased and used by individuals (Rogers, 153). One of the first of such machines was introduced in January 1975. Popular Electronics magazine provided plans that would allow any electronics wizard to build his own small, programmable computer for about $380 (Rose, 32). The computer was called the Altair 8800. Its programming involved pushing buttons and flipping switches on the front of the box. It didn't include a monitor or keyboard, and its applications were very limited (Jacobs, 53). Even so, many orders came in for it, and several famous owners of computer and software manufacturing companies got their start in computing through the Altair. For example, Steve Jobs and Steve Wozniak, founders of Apple Computer, built a much cheaper, yet more productive version of the Altair and turned their hobby into a business (Fluegelman, 16).

After the introduction of the Altair 8800, the personal computer industry became a fierce battleground of competition. IBM had been the computer industry standard for well over a half-century. They held their position as the standard when they introduced their first personal computer, the IBM Model 60, in 1975 (Chposky, 156). However, the newly formed Apple Computer company was releasing its own personal computer, the Apple II (the Apple I was the first computer designed by Jobs and Wozniak in Wozniak's garage, which was not produced on a wide scale). Software was needed to run the computers as well. Microsoft developed a Disk Operating System (MS-DOS) for the IBM computer, while Apple developed its own software system (Rose, 37). Because Microsoft had now set the software standard for IBMs, every software manufacturer had to make their software compatible with Microsoft's. This would lead to huge profits for Microsoft (Cringley, 163).

The main goal of the computer manufacturers was to make the computer as affordable as possible while increasing speed, reliability, and capacity. Nearly every computer manufacturer accomplished this, and computers popped up everywhere. Computers were in businesses keeping track of inventories. Computers were in colleges aiding students in research. Computers were in laboratories making complex calculations at high speeds for scientists and physicists. The computer had made its mark everywhere in society and built up a huge industry (Cringley, 174).

The future is promising for the computer industry and its technology. The speed of processors is expected to double every year and a half in the coming years. As manufacturing techniques are further perfected, the prices of computer systems are expected to steadily fall. However, since the microprocessor technology will be increasing, its higher costs will offset the drop in price of older processors. In other words, the price of a new computer will stay about the same from year to year, but technology will steadily increase (Zachary, 42).

Since the end of World War II, the computer industry has grown from a standing start into one of the biggest and most profitable industries in the United States. It now comprises thousands of companies, making everything from multi-million dollar high-speed supercomputers to printout paper and floppy disks.
It employs millions of people and generates tens of billions of dollars in sales each year (Malone, 192). Surely, the computer has impacted every aspect of people's lives. It has affected the way people work and play. It has made everyone's life easier by doing difficult work for people. The computer truly is one of the most incredible inventions in history.
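The claim above that processor speed doubles every year and a half implies exponential growth: after n years, speed grows by a factor of 2^(n/1.5). A minimal sketch of that arithmetic (the 1.5-year doubling period comes from the essay; the sample year counts are illustrative, not from its sources):

```python
# Relative processor speed under the essay's assumption that
# speed doubles every 1.5 years: speedup(n) = 2 ** (n / 1.5).

def speedup(years: float, doubling_period: float = 1.5) -> float:
    """Relative speed after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (1.5, 3.0, 6.0, 15.0):
    print(f"after {years:4.1f} years: {speedup(years):,.0f}x faster")
# after  1.5 years: 2x faster
# after  3.0 years: 4x faster
# after  6.0 years: 16x faster
# after 15.0 years: 1,024x faster
```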

Sunday, December 1, 2019

Kant Essays (1627 words) - Kantianism, Enlightenment Philosophy

Kant

How does one label Kant as a philosopher? Is he a rationalist or an empiricist? Kant makes a distinction between appearances and things in themselves. He also says that things in themselves exist, and that we have no knowledge of things in themselves. This could be labeled CLOSE TO NONSENSE, but we know Kant better than that. No matter how many laps on the track of metaphysics Kant takes us through, he is still widely held as one of the greatest modern philosophers. Let us explore the schools of rationalism and empiricism, compare his views with those of other rationalists and empiricists (mainly Hume), and see where he ends up on the finish line toward the nature of human knowledge.

The term rationalism is used to designate any mode of thought in which human reason holds the place of supreme truth. Knowledge in this school of thought must be founded upon necessary truths (those that must be true and cannot be false). Not all our ideas can be derived from experience: everything we experience is finite, but we do have the idea of infinity, or else we couldn't conceive of things as finite. Descartes and Leibniz are well-known rationalists (handout on Rationalism versus Empiricism).

Empiricism, on the other hand, is the concept that knowledge is grounded in experience, not reason, and that our minds begin as a tabula rasa (a term used by the great empiricist John Locke, meaning "blank slate"). Reason, for empiricists, can only process the ideas experience gives us. Knowledge is also founded on contingent truths (those that can be either true or false); necessary truths are only good for organizing our ideas, as in mathematics, but that is all. There are also no innate ideas in empiricism; all of our ideas are built up from the raw materials given by our experience. Well-known empiricists include Locke, Berkeley, and Hume (handout on Rationalism versus Empiricism).

So now that we know where the rationalists and empiricists generally stand, let us see where Kant generally stands. For Kant, human thought exists at three closely interrelated and interconnected levels (Ross, 2000). Sensibility structures our perception through the forms of space and time. Understanding corresponds to our individual judgments in thought. Reason is the totality of our judgments. Their relationship is crucial in Kant's theory of the thing in itself. The thing in itself is the product of our mind's commitment to thinking about the phenomena (the items of our experience) as appearances (Ross, 2000).

It might seem inappropriate to describe Kant as an empiricist. He believed, contrary to the basic empiricist principle, that there are important propositions that can be known independently of experience. He devoted virtually all of his efforts as a researcher to discovering how it is possible for us to have synthetic a priori knowledge. However, Kant also believed that there are some things that we can know only through sensory experience. Kant appears to have left experience in charge of our knowledge (Ross, 2000). But let us not concede yet.

In Kant's Critique of Pure Reason (Transcendental Deduction), in the middle of his argument for why certain concepts would be necessary and known a priori with respect to experience, Kant realized that synthesis would have to produce not just a structure of thought but the entire structure of consciousness within which perception also occurs. He says that what is first given to us is appearance, and then, combined with consciousness, we have perception.
It is the structure of consciousness that turns appearances into objects and perceptions; without it, they would be nothing. Kant made synthesis a function of imagination rather than thought, though this creates its own confusions. Synthesis therefore brings things into consciousness, making it possible for us to recognize that our consciousness exists and that there are things in it (Ross, 2000).

Let us now briefly look at Kant and his position with respect to rationalism. Kant always believed that reason connected us directly to things in themselves. This notion, however, does not allow for metaphysics as practiced by the rationalists, because reason alone does not determine any positive content of knowledge (Ross, 2000). Kant's theory as one of empirical realism is still very…