Saturday, November 30, 2019
Purifying Used Cooking Oil
The researchers investigated the effects of sedimentation, activated carbon, decantation and boiling on purifying used coconut, palm and vegetable oil. The experiment resulted in changes to the appearance, odour and viscosity of each type of oil. The purified cooking oil heats faster, which makes cooking easier, faster and more efficient. The purified oil is quite beneficial; however, it does not take on as much taste as the unpurified oil. Overall, the experiment was very successful in terms of finding the positive differences in each type of oil. The vegetable oil was the best product of all the processes because it showed the best improvement in all aspects, including appearance, odour and viscosity, compared to the coconut and palm oil in the experiment. This study can benefit people who love to cook.

Acknowledgements

The researchers would like to thank the following for making this study successful:

- The Anico family, for openly welcoming the researchers into their home without hesitation.
- Ms. Michelle Baldevarona, for being patient in helping with the SIP every step of the way.
- Most of all, the Heavenly Father, for blessing the researchers with minds that are capable of interpreting the information taught and transferring it into useful knowledge.

Chapter 1 Introduction

Background of the Study

In the commercial world of fast food restaurants and Filipino homes, lessening expenses is one of the main goals. Most fast food restaurants, such as Jollibee and McDonald's, try to lessen expenses by reusing cooking oil. More often than not, they reuse cooking oil without making sure that it is still sanitary and healthy to use in cooking. Cooking oil is a primary ingredient in many Filipino dishes. Therefore, reusing it without ensuring its sanitary and nutritional value raises many health concerns, such as an increase in cholesterol due to the reused fats present. Cooking oils undergo a complex series of changes and reactions during heating and frying. Used cooking oils can be purified by removing the odour, undesirable taste and colour substances. Activated carbon and the processes of decantation, sedimentation and boiling are potential means of improving the quality of used edible cooking oils.

Statement of the Problem

Will sedimentation, activated carbon, boiling and decantation purify coconut, palm and vegetable oil? Which oil is the best product of the purification processes?

Hypothesis

Purified cooking oil is comparable to unpurified cooking oil in terms of content and quality, such as appearance, odour and viscosity.

Definition of Terms

Activated Carbon - a form of carbon processed to be riddled with small, low-volume pores that increase the surface area available for adsorption or chemical reactions.
Coconut Oil - an edible oil extracted from the kernel or meat of matured coconuts harvested from the coconut palm.
Decantation - a process for the separation of mixtures, by removing a top layer of liquid from which a precipitate has settled.
Palm Oil - an edible vegetable oil derived from the mesocarp (reddish pulp) of the fruit of the oil palms.
Sedimentation - a natural process in which solid materials sink to the bottom over a period of time.
Vegetable Oil - a triglyceride extracted from a plant.

Significance of the Study

The study will benefit people who use cooking oil to prepare meals. This will not only save them money but also assure them that their food is still edible, given that reused cooking oil can easily become rancid (spoiled) and deteriorate to the point that it produces undesirable flavours and odours. Besides ruining what would have been a perfectly good meal, rancid oils also contain free radicals that are potentially carcinogenic.

Scope and Limitation

This study covered the purification of used cooking oils through the use of activated carbon and the processes of decantation and boiling. The researchers used vegetable oil, palm oil and coconut oil as the subjects of the experiment. Variables such as the amount of cooking oil used, the length of time it took to cook, the temperature, and the food used to cook were controlled. On the other hand, the manipulated variable was the type of cooking oil.

Chapter 2 Review of Related Literature

Uses and Effects

Filipinos are fond of using cooking oil in their homes. They are also conscious of saving money by reusing these oils. But when cooking oils are reused without purification, health hazards may occur. One of these is the formation of 4-hydroxy-trans-2-nonenal (HNE), which arises when food particles left over from previously cooked food are reheated again. HNE can cause cardiovascular disease, stroke, various liver disorders, and cancer.

Activated Carbon

Activated carbon is a form of carbon processed to be riddled with small, low-volume pores that increase its adsorption of impurities from liquids passed through it. This can remove the unwanted food particles and further purify the oil.

Sedimentation

Sedimentation is the tendency for particles in suspension to settle out of the fluid in which they are entrained and come to rest against a barrier.

Decantation

Decantation is a process for the separation of mixtures, by removing a top layer of liquid from which a precipitate has settled. Usually a small amount of solution must be left in the container, and care must be taken to prevent a small amount of precipitate from flowing with the solution out of the container. It is frequently used to purify a liquid by separating it from a suspension of insoluble particles.

Coconut Oil

Coconut oil is an edible oil extracted from the kernel or meat of matured coconuts harvested from the coconut palm (Cocos nucifera). It has various applications in food, medicine, and industry. Coconut oil is commonly used in cooking, especially for frying, and is a common flavour in many South Asian curries. It has been used for cooking in tropical parts of the world for thousands of years. Coconut oil is used by movie theatre chains to pop popcorn, adding a large amount of saturated fat in the process.

Palm Oil

Palm oil (also known as dende oil, from Portuguese) is an edible vegetable oil derived from the mesocarp (reddish pulp) of the fruit of the oil palms. Palm oil is naturally reddish in colour because of a high beta-carotene content. It is not to be confused with palm kernel oil, derived from the kernel of the same fruit, or coconut oil, derived from the kernel of the coconut palm (Cocos nucifera).
The differences are in colour (raw palm kernel oil lacks carotenoids and is not red) and in saturated fat content: palm mesocarp oil is 41% saturated, while palm kernel oil and coconut oil are 81% and 86% saturated respectively.

Vegetable Oil

A vegetable oil is a triglyceride extracted from a plant. Such oils have been part of human culture for millennia. The term vegetable oil can be narrowly defined as referring only to substances that are liquid at room temperature, or broadly defined without regard to a substance's state of matter at a given temperature. For this reason, vegetable oils that are solid at room temperature are sometimes called vegetable fats.

Viscosity

The viscosity of a fluid is a measure of its resistance to gradual deformation by shear stress or tensile stress. For liquids, it corresponds to the informal notion of "thickness". Viscosity is due to the friction between neighbouring particles in a fluid that are moving at different velocities.

Chapter 3 Methodology

Subject of the Study

This study made use of three kinds of cooking oil which are mainly used in Filipino homes, namely vegetable oil, palm oil and coconut oil.

Materials

- ? cup of vegetable oil
- ? cup of palm oil
- ? cup of coconut oil
- Activated carbon
- Bottle where the decantation process will take place
- Pot where boiling can take place

Procedures

1) After the cooking oil has been used, let the oil stand for a while so that the food particles can settle at the bottom.
2) Pour it through a bottle that contains activated carbon in the middle and has holes at the bottom for the oil to pass through.
3) When you've removed the solids and particulates, pour an amount of water equal to the volume of oil into a large pot or kettle with deep sides. Pour in your oil. Add about 1/2 teaspoon of salt per quart of total liquid to the pot.
4) Bring the oil and water mixture to the boil, and then boil it hard for about 5 to 10 minutes. The darker, more scorched, and/or more strongly flavoured the oil, the longer you should boil the mixture.
5) Remove from the heat, and set aside to settle out. It takes about 10 to 30 minutes for the oil to completely separate and come to the top.
6) Pour off the water portion as completely as possible and discard it.
7) Put the oil portion back into the deep pot or kettle. Over medium heat, bring it to the boil (which for oil is hotter, obviously, than for water). Reduce the heat until it is boiling and popping a bit, but slowly. The goal here is to evaporate all of the retained water. When the oil becomes very clear looking and no longer makes any sound (no popping or sizzling), has no more bubbles rising, and no more steam coming to the top, it's done.
8) Allow to cool, then bottle in an airtight container for reuse.
Chapter 4 Presentation, Analysis and Interpretation of Data

Table 1: Observations after the Cooking Oils Were Used and after the Purification Process

Type of Cooking Oil | Appearance | Odour
Unpurified Vegetable Oil | Looks like regular cooking oil | Contains a subtle scent of something burnt
Purified Vegetable Oil | Very clear yellow colour | Smells like original vegetable oil prior to being used
Unpurified Palm Oil | Very dark yellowish-brown colour | Contains a subtle scent of hotdog
Purified Palm Oil | Slightly lighter shade than previous colour | Still contains a subtle scent of hotdog
Unpurified Coconut Oil | Murky yellowish-brown colour | Contains a subtle scent of hotdog
Purified Coconut Oil | Murky light yellow colour | Contains a subtle scent that can't be identified

Major variations were observed between the oils after cooking and after they were purified with the processes of sedimentation, activated carbon, decantation and boiling. Six regular-sized hotdogs were cooked in the oils at temperatures between 180 and 190 degrees Celsius (medium heat) over a span of 5 minutes.

Table 2: Viscosity Test

Type of Oil | 1st Trial | 2nd Trial | 3rd Trial | Average
Unpurified Coconut Oil | 1.28 secs | 0.98 secs | 1.20 secs | 1.15 secs
Unpurified Palm Oil | 0.99 secs | 1.15 secs | 1.18 secs | 1.11 secs
Unpurified Vegetable Oil | 1.10 secs | 0.97 secs | 1.15 secs | 1.07 secs
Purified Coconut Oil | 0.95 secs | 1.03 secs | 0.91 secs | 0.96 secs
Purified Palm Oil | 1.10 secs | 0.93 secs | 0.96 secs | 1.00 secs
Purified Vegetable Oil | 0.97 secs | 1.06 secs | 0.94 secs | 0.99 secs

The test was done with 100 ml of each cooking oil, both purified and unpurified. The weight dropped into each container weighed 50 grams (a short arithmetic check of the averages appears at the end of this post). The unpurified cooking oils were the most viscous, and the purified oils were the less viscous ones. The viscosity of a cooking oil affects how quickly it heats: the more viscous a substance is, the longer it takes to heat. Although less viscous oils are faster to heat, foods cooked in them do not take on as much of the oil's taste compared to the more viscous ones. The purified oils were less viscous because of the purification processes performed on them.

Chapter 5 Conclusion and Recommendation

Summary

The researchers' purification of the cooking oils proved successful. The positive differences were demonstrated in the observation table and the viscosity test. The three oils used, namely coconut, palm and vegetable, all showed these differences in their appearance, odour and viscosity.

Conclusion

Our experiment proved successful in terms of finding key differences between the used and purified cooking oils. Based on the tables presented, there were significant differences between the unpurified cooking oil and the purified cooking oil in terms of colour, odour and viscosity. The purified cooking oils proved to be more beneficial in those aspects. Overall, the vegetable oil was the best product of our purification process because it showed the most improvement in appearance and odour and the least change in viscosity compared to the other two.

Recommendations

For future use and investigation, the researchers recommend that a wider variety of oils, such as olive, canola and peanut, be tested. The researchers also recommend that more purification processes be tested with the oils to purify them further. Lastly, a nutritionist is recommended to test the nutritive value of these oils to determine whether or not they changed after being used and after purification.
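As a quick arithmetic check on Table 2, the reported averages can be reproduced in a few lines of Python (a minimal sketch; the trial times are simply those listed in the table):

import statistics  # standard library

trials = {
    "Unpurified Coconut Oil":   [1.28, 0.98, 1.20],
    "Unpurified Palm Oil":      [0.99, 1.15, 1.18],
    "Unpurified Vegetable Oil": [1.10, 0.97, 1.15],
    "Purified Coconut Oil":     [0.95, 1.03, 0.91],
    "Purified Palm Oil":        [1.10, 0.93, 0.96],
    "Purified Vegetable Oil":   [0.97, 1.06, 0.94],
}
for oil, times in trials.items():
    # Mean of the three timed trials, rounded to two decimals as in Table 2.
    print(oil, round(statistics.mean(times), 2), "secs")

Each printed mean matches the Average column, e.g. 1.15 secs for unpurified coconut oil and 0.96 secs for purified coconut oil.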
Tuesday, November 26, 2019
The Distance Between Degrees of Latitude and Longitude
What is the precise location of Los Angeles? It can be stated in relative terms (about 3,000 miles west of New York, for example), but for a cartographer, pilot, geologist, or geographer, a much more specific measurement is needed. In order to precisely locate any spot in the world, therefore, we use a geographic coordinate system that is measured in degrees of latitude and longitude. This system starts with an imaginary grid of lines that cover the entire planet. Locations are measured based on both X and Y coordinates within the grid. Because the Earth is round, however, the distances between lines on the grid vary.

Defining Latitude and Longitude

Longitude is defined by imaginary lines called meridians that run from the north to the south pole. There are a total of 360 meridians, one for each degree of longitude. The Prime Meridian, which runs through the Greenwich Observatory in England, marks zero degrees longitude. It should not be confused with the International Date Line, which lies roughly along the 180-degree meridian on the opposite side of the globe; every location just east of the date line is one calendar day earlier than every location just west of it.

Latitude is defined by imaginary lines called parallels because they are parallel to the equator and to one another. The equator, which runs in a circle around the center of the Earth, divides the planet into northern and southern hemispheres.

Lines of latitude and longitude intersect, creating a grid that allows anyone in any location to pinpoint a geographic location. There are 360 degrees of longitude (because meridians make Great Circles around the globe), and there are 180 degrees of latitude. To further specify exactly where to find anything on Earth, measurements are stated not only in degrees but also in minutes and seconds. Each degree can be broken into 60 minutes, and each minute can be divided into 60 seconds. Any given location can be described in terms of degrees, minutes, and seconds of longitude and latitude.

What Is the Distance Between Degrees of Latitude?

Degrees of latitude are parallel so, for the most part, the distance between each degree remains constant. However, the Earth is slightly elliptical in shape, and that creates a small variation between the degrees as we work our way from the equator to the north and south poles.

- Each degree of latitude is approximately 69 miles (111 kilometers) apart.
- At the equator, the distance is 68.703 miles (110.567 kilometers).
- At the Tropic of Cancer and Tropic of Capricorn (23.5 degrees north and south), the distance is 68.94 miles (110.948 kilometers).
- At each of the poles, the distance is 69.407 miles (111.699 kilometers).

This is rather convenient when you want to know how far it is between each degree, no matter where you are on Earth. All you need to know is that each minute (1/60th of a degree) is approximately one mile. For example, if we were at 40 degrees north, 100 degrees west, we would be on the Nebraska-Kansas border. If we were to go directly north to 41 degrees north, 100 degrees west, we would have traveled about 69 miles and would now be near Interstate 80.

What Is the Distance Between Degrees of Longitude?

Unlike latitude, the distance between degrees of longitude varies greatly depending upon your location on the planet. They are farthest apart at the equator and converge at the poles.

- A degree of longitude is widest at the equator, with a distance of 69.172 miles (111.321 kilometers).
- The distance gradually shrinks to zero as the meridians meet at the poles.
- At 40 degrees north or south, the distance between degrees of longitude is 53 miles (85 kilometers). (A short sketch after this list shows the calculation.)
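The shrinking width follows the cosine of the latitude, which is easy to verify. Here is a minimal Python sketch, treating the Earth as a sphere and taking the 69.172-mile equatorial figure from the list above:

import math

def longitude_degree_miles(latitude_deg):
    # One degree of longitude spans cos(latitude) times its width at the equator.
    return 69.172 * math.cos(math.radians(latitude_deg))

print(longitude_degree_miles(0))    # 69.172 miles at the equator
print(longitude_degree_miles(40))   # about 53 miles, as stated above
print(longitude_degree_miles(90))   # 0.0 - the meridians converge at the poles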
The line at 40 degrees north runs through the middle of the United States and China, as well as Turkey and Spain. Meanwhile, 40 degrees south is south of Africa, goes through the southern part of Chile and Argentina, and runs almost directly through the center of New Zealand.

Calculate the Distance from One Point to Another

What if you are given two coordinates for latitude and longitude and you need to know how far it is between the two locations? You could use what is known as a haversine formula to calculate the distance - but unless you are a whiz at trigonometry, it is not easy. Luckily, in today's digital world, computers can do the math for us. Most interactive map applications will allow you to input GPS coordinates of latitude and longitude and tell you the distance between the two points. There are a number of latitude/longitude distance calculators available online. The National Hurricane Center has one that is very easy to use.

Keep in mind that you can also find the precise latitude and longitude of a location using a map application. In Google Maps, for example, you can simply click on a location and a pop-up window will give latitude and longitude data to a millionth of a degree. Similarly, if you right-click on a location in MapQuest you will get the latitude and longitude data.

Source

Latitude/Longitude Distance Calculator. National Hurricane Center and Central Pacific Hurricane Center.
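For readers who want to see what the haversine formula mentioned above actually looks like, here is a minimal Python sketch. It assumes a perfectly spherical Earth with a mean radius of 3,958.8 miles, so results are approximate:

import math

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in miles.
    earth_radius = 3958.8  # mean Earth radius; an approximation
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius * math.asin(math.sqrt(a))

# One degree due north from the Nebraska-Kansas border example above:
print(haversine_miles(40, -100, 41, -100))  # roughly 69 miles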
Friday, November 22, 2019
Proton - Definition of Physics Terms
A proton is a positively charged particle that resides within the atomic nucleus. The number of protons in the atomic nucleus is what determines the atomic number of an element, as outlined in the periodic table of the elements.

The proton has a charge of +1 (or, alternately, 1.602 x 10^-19 coulombs), the exact opposite of the -1 charge carried by the electron. In mass, however, there is no contest - the proton's mass is approximately 1,836 times that of an electron.

Discovery of the Proton

The proton was discovered by Ernest Rutherford in 1918 (though the concept had been earlier suggested by the work of Eugene Goldstein). The proton was long believed to be an elementary particle until the discovery of quarks. In the quark model, it is now understood that the proton is composed of two up quarks and one down quark, bound together by gluons, in the Standard Model of quantum physics.

Proton Details

Since the proton is in the atomic nucleus, it is a nucleon. Since it has a spin of 1/2, it is a fermion. Since it is composed of three quarks, it is a triquark baryon, a type of hadron. (As should be clear at this point, physicists really enjoy making categories for particles.)

- Mass: 938 MeV/c^2 (1.67 x 10^-27 kg)
- Charge: +1 fundamental unit (1.602 x 10^-19 coulombs)
- Diameter: 1.65 x 10^-15 m
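The two mass figures above are the same quantity in different units, which a quick conversion via m = E/c^2 confirms (a sketch using standard constants, not part of the original definition):

# Convert the proton's mass from MeV/c^2 to kilograms via m = E / c^2.
MEV_IN_JOULES = 1.602176634e-13   # energy of 1 MeV, in joules
LIGHT_SPEED = 2.99792458e8        # speed of light, in m/s

mass_kg = 938 * MEV_IN_JOULES / LIGHT_SPEED ** 2
print(mass_kg)  # about 1.67e-27 kg, matching the figure above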
Thursday, November 21, 2019
Methods & Survey Research Designs - Coursework Example
Positive Relationships: A positive relationship is a relationship that signifies a direct relationship between two variables. That is, when one variable increases, the other variable is also likely to increase, and when one variable decreases, the other also decreases.

Negative (Inverse) Relationships: A negative relationship means that an increase in the value of one variable leads to a decrease in the value of the other variable, and vice versa. This relation is also known as an inverse relationship. (A short sketch at the end of this post illustrates both cases.)

Pilot Test: A pilot test is a minor version of a larger survey test, carried out to get an idea of the real test. It involves prior testing of a research tool, for instance a new information-gathering method, and it can also be used to test a hypothesis or design.

Critical Theory: It is a social theory aimed at analyzing and critiquing society as a whole, in contrast with traditional theory, which aims only to explain it. Critical theories intend to dig under the shell of social life and expose various theories that render a truer and fuller understanding of how the world works.

Cultural Portrait: A cultural portrait can reflect high moral and spiritual human qualities. It also has the capacity to honestly reveal the negative qualities of the subjects under study. Cultural portraits are most common in satirical portraits and caricatures.

Bounded System: A bounded system has territories with identifiable edges between the interior and exterior, as well as spaces with different functions happening in different spaces. Examples include an organization, a family, a program, or a class in school.

Discriminate Sampling: It is a procedure which decides the group to which a person belongs according to his or her individual characteristics.

Gatekeeper: A gatekeeper in traditional research methodology is a person with whom the researcher has to negotiate access to participant subjects. The role implies a related position such as stewardship, ownership or other executive authority along the lines of the prevailing cultural standards of the research setting.

In Vivo Codes: In vivo codes are the actual terms and expressions used by the various actors themselves. They provide details to the analyst about the ways in which the basic problems of the actors are handled.

Memoing: Memoing is the process of recording reflective notes concerning what the analyst learns from the data. Memos accumulate as written records of ideas regarding concepts and their relations.

Progressive-Regressive Method: The progressive-regressive method involves movement in two directions. The progressive stage starts from what is apparent to examination, and the regressive stage goes back to its older roots.

Inductive Reasoning: The term inductive reasoning means to analyze from the bottom up. It takes specific data and creates a broader generalization that is considered probable, allowing for the fact that the conclusion may not be precise.

Field Notes: Field notes refer to the various notes recorded by researchers during or after their study of a specific subject. They are especially valuable in the descriptive sciences, where they carry high importance.

Field Journal: The field journal is a notebook that a researcher uses to record personal notes, observational notes, sketches, lists of terms, ideas and so on while he or she is engaged in fieldwork.
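To make the first two definitions above concrete, here is a small Python sketch computing Pearson correlations (the numbers are invented purely for illustration, and NumPy is assumed to be installed):

import numpy as np

hours_studied = np.array([1, 2, 3, 4, 5])
test_scores   = np.array([52, 60, 71, 80, 88])  # rises as hours studied rise
absences      = np.array([9, 7, 6, 3, 1])       # falls as hours studied rise

# A coefficient near +1 indicates a positive relationship...
print(np.corrcoef(hours_studied, test_scores)[0, 1])
# ...and a coefficient near -1 indicates a negative (inverse) relationship.
print(np.corrcoef(hours_studied, absences)[0, 1])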
Tuesday, November 19, 2019
In what ways did industrialization create new opportunities for women? How and why were these opportunities limited?
Important changes included the elevation of women's positions and the creation of more demanding jobs. Industrialization shifted the American economy from an agricultural one to one characterized by wage labor, enabling many women to enter paid work. This was possible as women worked in textile industries, mining and agriculture. A later change in industrial configuration lessened the number of women laborers. Some opportunities, such as employment as a miner, were then outlawed by the regime, it being deemed illegitimate for a woman to toil as a drawer in coal excavation. Another impediment to the employment of women came from the gender division of labor: gender defined the roles of men and women discretely (Hillstrom et al., 205). The cultural devaluing of women's household work camouflaged its continuation, leading to a decrease in its perceived economic importance. The location of the workplace was another limit on these opportunities. When workplaces were far away from women's homes, women were unable to work comfortably while at the same time taking care of their children (Sylvia 2008). Women who got married would rather stay home and look after the children, owing to undue influence from their husbands and core cultural values.
Saturday, November 16, 2019
Savings and Loans Crisis
INTRODUCTION

In the 1980s, the savings and loan (SL) industry was in turmoil; the watershed event behind the crisis was the price-fixing legislation implemented in favour of home ownership in the 1930s. Even though that legislation was the basis of the crisis, the trigger lies in more fundamental factors, including fiscal policy, mismanagement of assets and liabilities, pure imprudence by SL institutions, brokered deposits and the cyclicality of the regulation/deregulation process, all fuelled by economic reactions such as inflation. It would be 'unfair' to attribute the crisis to only one factor. Therefore, to investigate the crisis properly, and with a view to giving an all-round perspective, this report will discuss this financial disaster's main causes.

The impact of the crisis was borne mostly by the SL industry, the savings and commercial banks in the US and, more generally, the US economy. This report will further cover the corrective measures undertaken by regulators and the government with the aim of saving the SL sector as the number of institutions with worsening financial conditions steeply increased. The consequences of this crisis persisted until the early 1990s, and this long-term effect is understood by analysing the regulations enacted in the aftermath of the crisis. The main turning point was the enactment of the Financial Institutions Reform, Recovery and Enforcement Act in 1989. Finally, there are essential lessons to be learned from the SL crisis, not only for the SL institutions, but also for the banking industry, regulators and the government.

CAUSES

In the 1930s the SL industry was a conservative residential mortgage sector surrounded by legislation put in place during that period to promote home ownership. At the same time it had its own regulator, the Federal Home Loan Bank Board, and its own insurance fund, the Federal Savings and Loan Insurance Corporation (FSLIC), to insure deposits at SL institutions. However, the regulatory and interest rate environment started to change dramatically from the 1960s, when Congress applied Regulation Q to the SL industry by putting a ceiling on the interest rate that SLs could pay to depositors. The purpose was to help thrift institutions by extending the interest rate ceiling to them, in order to reduce their cost of liabilities and protect them from deposit rate wars, since there were inflationary pressures from the middle to the late 1960s.

Regulation Q was price fixing, and in trying to fix prices it caused distortions whose costs outweighed any benefits it may have offered. Regulation Q created a cross subsidy, passed from saver to home buyer, that allowed SLs to hold down their interest costs and thereby continue to earn, for a few more years, an apparently adequate interest margin on the fixed-rate mortgages they had written in the recent past. The problem was that the SL industry was not competing effectively for funds with commercial banks and the securities market, leading to large swings in the amount of money available for mortgage lending. The ceiling on the interest rate that SLs could offer to depositors under Regulation Q dampened competition for depositors' funds between banks and SLs. But as new money market funds began to compete fiercely during the 1970s for depositors' money by offering interest rates set by the market, SLs suffered significant withdrawals of deposits during periods of high interest rates.
This caused outflows from financial institutions into higher-yielding investments such as capital market instruments, government securities and money market funds. This process is known as disintermediation. Disintermediation had several undesirable consequences. Most important, it both restricted the availability of credit to consumers and increased its cost, particularly for home mortgages; the same consequences affected small and medium-sized businesses that did not have access to the commercial paper market. In addition, because cash outlays increased to meet deposit withdrawals while cash inflows decreased as new funds were diverted to alternative investments, disintermediation slowed the growth of financial institutions and caused them liquidity problems. To have the cash available to meet withdrawal demands, banks and thrifts were often forced either to borrow money at above-market interest rates or to sell assets, often at a loss from book value.

At the same time, the rise in oil prices in 1979 pushed inflation and headline interest rates up. Growing inflation in the 1970s received two huge boosts. The first comprised the late-1973 and 1979 oil shocks from OPEC (the Organization of Petroleum Exporting Countries); soaring oil prices compelled most American businesses to raise their prices as well, with inflationary results. The second boost to inflation came in the form of food harvest failures around the world, which created soaring prices on the world food market. Again, US companies that imported food responded with an inflationary rise in their prices.

In order to combat the increase in inflation, interest rates were raised to encourage people to save and spend less. The Federal Reserve opted for tighter monetary measures in reaction to inflationary concerns. As a result of the subsequent monetary tightening, interest rates rose abruptly and significantly, soaring from 9.06% in June 1979 to 15.2% in March 1980. Such a drastic change in base rates caused the yield curve to become inverted: the spread between the 10-year Treasury bond and the 3-month T-bill became negative, reaching 373 basis points in 1980 (http://www.milkeninstitute.org/pdf/InvrtdYieldCurvesRsrchRprt.pdf).

[Figure: Variation of the US Treasury three-month T-bill rate, showing the large rise and volatility of short-term interest rates. Source: http://www.milkeninstitute.org/pdf/InvrtdYieldCurvesRsrchRprt.pdf]

[Figure: Spread of the 10-year Treasury against the effective Federal Funds Rate, illustrating how the yield curve inverted during the SL crisis. Source: http://www.milkeninstitute.org/pdf/InvrtdYieldCurvesRsrchRprt.pdf]

With the high volatility of interest rates during this period, the SL industry failed to tackle the risk inherent in funding long-term, fixed-rate mortgages by means of short-term deposits. In other words, there was an asset/liability mismatch, with a negative gap and rising short-term interest rates.

Aftermath

In the early 1980s, attempting to resuscitate the SL industry, Congress tried to deal with the crisis by enacting the Depository Institutions Deregulation and Monetary Control Act of 1980 and the Garn-St Germain Depository Institutions Act of 1982. These acts allowed lower capital requirements which, based largely on book values rather than more market-value-oriented techniques, grossly overstated the health of financial institutions.
Regulators relaxed restrictions by decreasing the net worth requirement from 4% to 3% of total deposits, with the additional flexibility of not complying with generally accepted accounting principles (GAAP). The process of deregulation further included an extension of the period for amortisation of supervisory goodwill, while the Bank Board removed the specific ownership limitations for SL shareholders, changing the minimum restriction of 400 shareholders to only one (with no single shareholder or group holding more than 10% and 25% respectively) and accepting means of payment other than cash. In particular, rules on net worth changed so that thrifts could continue to operate even at historically low capital levels. Also, "supervisory goodwill" was used to balance the books in terms of capital requirements and accounting numbers. This goodwill had no economic substance and simply compensated institutions which, in a merger, took over the economically impaired assets of insolvent institutions. All in all, the changes in the accounting and capital treatment of supervisory goodwill enabled SLs to post stronger accounting and capital numbers even though the underlying economic situation had deteriorated. The ceiling on the loan-to-value ratio was cancelled as well. Forbearance, or the decline in regulatory oversight, was also a major factor in the debacle. Most importantly, savings and loan interest rate ceilings were removed.

SLs had a large proportion of variable-rate liabilities (deposits) funding fixed-rate assets; they held 84.5% of their assets as home mortgages. These institutions had a negative GAP, as the amount of rate-sensitive liabilities (RSL) was larger than that of rate-sensitive assets (RSA):

GAP = RSA - RSL

Therefore, they were exposed to any rise in interest rates, as the yield on the assets was fixed while the cost of liabilities increased. With the rapid increase in the base rate in the 1980s, FIs' cost of rate-sensitive liabilities rose faster than they could adjust the return on their assets. They had to maintain a high level of interest paid on deposits to avoid deposit withdrawals. The net interest income - the difference between interest earned on assets and the cost of liabilities - decreased significantly:

NII = Asset Return - Cost of Liabilities

On average, the returns on home loans were 9% with an average deposit rate of 7%, which implied a 2% net interest income. Given the tight regulations surrounding the SLs, these institutions relied on the 2% net interest income as their main source of income. However, as the base rate rose dramatically, the NII dropped to negative figures, reaching -1.0% in 1981. Many institutions lost huge amounts of money.

Savings and loans specialised in originating and holding home mortgage loans, which were relatively long-term assets with fixed interest rates. However, these were funded by relatively short-term deposits whose interest rates were variable. This maturity mismatch was exposed to the risk of an interest rate rise. With the market value of the assets being more volatile because of their longer maturity, and as a consequence their longer duration, the rise in interest rates decreased the value of the mortgages to very low levels. The value of the liabilities decreased as well, but to a smaller extent. Since net worth is the difference between the market value of assets and the market value of liabilities, this led to negative equity at financial institutions:

dE = -(DA - DL x g) x A x dr/(1+r)

where DA and DL are the durations of assets and liabilities, A is total assets, g = L/A is the gearing (leverage) ratio and r is the interest rate. Since DA > DL x g, with dr > 0, the change in net worth dE is negative.
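A worked instance of the duration-gap formula makes the scale of the problem clear. The balance-sheet figures below are hypothetical, chosen only to resemble a thrift funding long mortgages with short deposits (a sketch, not data from the essay's sources):

# dE = -(DA - g * DL) * A * dr / (1 + r)
DA = 7.0          # duration of assets (fixed-rate mortgages), years; assumed
DL = 1.0          # duration of liabilities (short-term deposits), years; assumed
A = 100_000_000   # total assets, $; assumed
L = 92_000_000    # total liabilities, $; assumed
g = L / A         # gearing (leverage) ratio
r = 0.09          # initial rate, roughly the June 1979 level cited earlier
dr = 0.06         # rise toward the March 1980 level of about 15%

dE = -(DA - g * DL) * A * dr / (1 + r)
print(round(dE))  # about -33,500,000: far more than the $8m of book equity

With a rate rise of six percentage points, the loss in market-value equity is several times the institution's entire capital, which is exactly how thrifts with positive book net worth became economically insolvent.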
The leverage-adjusted duration gap between the assets and liabilities was so large that, with a large rise in interest rates, the equity value turned negative. By the early 1980s, savings and loans throughout the country were insolvent by about $110 billion, and the insurance fund was reporting only $6 billion in reserves (Barth, 1991; Brumbaugh, 1988; Kane, 1989).

The legislation also allowed savings and loans to begin to diversify into commercial real estate loans and other loans that commercial banks could already make. Congress hoped that if SLs invested in riskier, and thus higher-yielding, assets, they would be able to offset the losses they had previously made. The savings and loans were also allowed to originate adjustable-rate home loans. By 1983, most SLs were deemed economically profitable, but 9% of the SL industry was insolvent. However, the Federal Home Loan Bank Board (FHLBB) went ahead with its plan of regulating the industry by imposing a 7% net worth limit for new entrants to the thrift industry, so as to promote safe risk management practices and investments.

Although all these developments were intended to help savings and loans, they gave rise to a subsequent twist in the crisis. The new changes did indeed allow savings and loans to reduce their interest rate risks, but they exposed savings and loans to new risks, mainly credit risk. While defaults on home mortgages were low, defaults and associated losses on other types of loans and investments are typically much higher. By combining interest rate risk with credit risk spread over a wider geographical area, experienced institutions had greater opportunities to choose a prudent overall balance of risk and return. However, many savings and loans began making commercial real estate loans, an activity in which they were relatively inexperienced. Since investing in real estate loans entailed unique risks and required specific skills, SLs eventually made losses on the real estate loans. These credit quality problems are reflected in the net income of the industry, which plunged once again, even more steeply than in the early 1980s when the yield curve inverted. The industry lost nearly $21 billion in 1987 and 1988, and almost another $8 billion in 1989.

Many open but insolvent savings and loans had incentives to take excessive risks and "gambled for resurrection" because of the phenomenon of moral hazard. If something went wrong, the federal deposit insurance fund would bear the losses; yet the owners would reap the rewards if everything went well. The legislation, however, did not change how premiums were set for federal deposit insurance, meaning that riskier institutions and prudent ones were charged the same premium. In fact, the level of insured deposits was raised from $40,000 to $100,000. The new, lower capital requirements and broader opportunities to lend and invest allowed some savings and loans to take larger risks. With federally insured deposits and the ability to attract more deposits by offering higher rates of interest, deeply troubled savings and loans always had ready access to additional funds. Deregulation thus encouraged increased risk-taking by SLs. However, in the mid- to late 1980s, with considerable real estate loans and investments outstanding, regional recessions struck the USA, which reduced commercial real estate values. In particular, an unexpected plunge in the price of oil in 1986 contributed to recession.
To make matters worse, Congress passed the Tax Reform Act of 1986, which more than eliminated the tax benefits of commercial real estate ownership that it had conveyed only a few years earlier. Commercial real estate values fell dramatically as a result, severely affecting the asset values of the SLs. In 1987, the Bank Board emphasised the importance of capitalisation by requiring supervisory approval for SLs engaging in investments above 2.5 times their tangible capital base.

The main turning point was the Financial Institutions Reform, Recovery and Enforcement Act (FIRREA), which restructured the industry as a whole by setting up the Resolution Trust Corporation - which in total resolved or liquidated 747 thrifts with assets valued at $394 billion - jettisoning both the FHLBB and the FSLIC, and setting up a new regulatory institution, the Office of Thrift Supervision. The key to this act was that instead of trying to save the SLs which were barely solvent, it dissolved them and focused on the solvent ones to reform the industry. With the assistance of market fundamentals - favourable interest rate conditions, the reinstatement of GAAP accounting and the recovery of the real estate market - the industry stabilised.

LESSONS LEARNT

The thrift crisis had a bailout plan of $153 billion, of which around 80% was financed by taxpayers. The number of institutions in the SL industry receded considerably until 1995, and before then the ability of the regulators and the government to deal with the crisis was questioned many times. What followed was a series of court battles, corruption charges and major restructuring. The consequences were therefore substantial enough for everyone to extract some observations and lessons.

The starting point of it all was overregulation, which dictated the restrictions and conditions under which an SL could function. That included the inability of the institutions to remain flexible at a time when economic conditions were changing and the financial sector was advancing. With fixed interest rates, it proved difficult for the SLs to compete, as their means of investing were limited. One crucial point is that additional regulations do not necessarily mean fewer risks. SLs had to assume additional exposure to interest rate risk, and alongside banks they were prevented from optimising their credit risk exposure. The government sometimes does not modify regulations as fast as the structure of the industry is changing, leading to new risks emerging, and the cycle goes on. To keep up with advancement, the government has to put in place tighter risk management policies and controls. However, regulators and government should not direct the investment decisions of institutions; rather, investments should be in line with market and economic forces.

At a later stage, the industry was deregulated in order to remedy the situation. However, this translated into a decrease in market discipline, as the SLs made high-risk investments relying on the safety net of the federal guarantee to cover any losses. Moral hazard, adverse selection and passive management were all observed. This exposed the disadvantage of the FSLIC arrangement at that time, which encouraged the SLs to take long-term and unreported risks. The deregulation, by reducing capital requirements, left the thrift industry more vulnerable to economic changes.
From the failure to resuscitate the industry, it was deduced that forbearance towards insolvent institutions might not always be the best option, as it can lead to a freeze in lending and stifle the economy. One of the lessons from the thrift crisis which has been consistently taken into account over the years was the reliance on capital ratios. During the deregulation period of the crisis, there was no monitoring of the capital bases of the thrifts, which ultimately led to insolvency. From then on, institutions had to follow standard capital requirements put in place by regulators. However, this focus recently proved detrimental in the credit crunch, showing that banks run on trust and confidence. It is important to realise that capital ratios and other accounting ratios might not reveal the real economic strength of an institution. The crisis led to more disclosure and market value accounting.

It has also been understood that it would have been best to restrict the involvement of public funds as a means of saving the industry. Using state or public funds to buy out thrifts below value is not in accord with public welfare. A solution would have been to subdivide the thrifts into insured and uninsured ones, with varying degrees of supervisory regulation concerning deposits and investments. One lesson learned was the emergence of an adjustable insurance premium rate, which became a function of the institution's regulatory rating, risk and capital levels.

CONCLUSION

For some years the final bill for the SL crisis remained uncertain. However, it is known now that the thrift crisis cost an extraordinary $153 billion - one of the most expensive financial sector crises the world has seen. Of this, the US taxpayer paid out $124 billion while the thrift industry itself paid $29 billion.

The consequences of the SL crisis for the structure and regulation of the US financial industry were profound. The number of institutions in the SL industry fell by about half between 1986 and 1995, partly due to the closure of around 1,000 institutions by regulators, the most intense series of institution failures since the 1930s. The failures prompted an overhaul of the regulatory structure for US banking and thrifts, a shake-up in the system of deposit insurance and implied government guarantees. Regulators shifted towards a policy of earlier intervention in failing institutions, so that the principal costs are more likely to be borne by shareholders than by other stakeholders. There was also a shift towards more risk-sensitive regulatory regimes, with respect to both net worth assessments and the payments to deposit insurance funds, while deposit insurance reform made it less likely that taxpayers would shoulder so great a burden in any future crisis.

At a wider level, the SL crisis taught politicians, regulators and bankers how misleading rules-driven regulatory and accounting numbers can be in relation to risky bank activities. At different stages of the crisis, reporting of the financial condition of SLs was deliberately selected by interested parties to cover up the true economic extent of the unfolding disaster. It was a risk reporting failure on a grand scale that greatly worsened the long-term economic consequences for the ultimate stakeholder: the US taxpayer.

REFERENCES

1. Myth: Carter ruined the economy; Reagan saved it. http://www.huppi.com/kangaroo/L-carterreagan.htm [Accessed 31 October 2010 to 18 November 2010]
2. The U.S. banking debacle of the 1980s: A lesson in government mismanagement. http://www.thefreemanonline.org/featured/the-us-banking-debacle-of-the-1980s-a-lesson-in-government-mismanagement/ [Accessed 31 October 2010 to 18 November 2010]
3. Inverted Yield Curve Research Report, Milken Institute. http://www.milkeninstitute.org/pdf/InvrtdYieldCurvesRsrchRprt.pdf [Accessed 31 October 2010 to 18 November 2010]
4. The Cost of the Savings and Loans Crisis, FDIC Banking Review. http://useconomy.about.com/library/s-and-l-crisis.pdf [Accessed 31 October 2010 to 18 November 2010]
5. The SL Crisis: A Chrono-Bibliography, FDIC. http://www.fdic.gov/bank/historical/s%26l/index.html [Accessed 31 October 2010 to 18 November 2010]
6. The Savings and Loan Crisis. http://wapedia.mobi/en/Savings_and_loan_crisis.html [Accessed 31 October 2010 to 18 November 2010]
7. US Savings and Loans Crisis, Sungard Bancware Erisk. http://www.prmia.org/pdf/Case_Studies/US_SL.pdf [Accessed 31 October 2010 to 18 November 2010]
8. Savings and Loans Crisis, FDIC Report Vol. 1. http://www.fdic.gov/bank/historical/history/167_188.pdf [Accessed 31 October 2010 to 18 November 2010]
9. The Economic Effects of the Savings and Loans Crisis, Congressional Budget Office. http://www.cbo.gov/ftpdocs/100xx/doc10073/1992_01_theeconeffectsofthesavings.pdf [Accessed 31 October 2010 to 18 November 2010]
10. The Cost of the Savings and Loans Crisis: Truth and Consequences, FDIC Banking Review. http://fcx.fdic.gov/bank/analytical/banking/2000dec/brv13n2_2.pdf [Accessed 31 October 2010 to 18 November 2010]
Thursday, November 14, 2019
Comparing Characters and Themes in Hamlet and Macbeth
Parallel Characters and Themes in Hamlet and Macbeth

Throughout William Shakespeare's plays Hamlet and Macbeth there are many similarities, along with many differences. These plays are both Shakespearean tragedies, which often use supernatural incidents to capture the reader's interest and feature a hero with a tragic flaw. There are many comparative and contrasting aspects in these plays.

The opening of Hamlet involves a supernatural occurrence, as does the opening of Macbeth. In the first scene the ghost of his father, King Hamlet, approaches Hamlet. Similarly, the opening of Macbeth involves the three witches. Although the witches can be seen by anyone they approach, the ghost of King Hamlet is seen only by Hamlet himself, and in one scene by Marcellus and Bernardo, the officers of the watch. Similarly, in both plays the main characters are slightly suspicious of the actual powers these supernatural figures have. While the witches use their apparent powers to tell Macbeth the future, the ghost of King Hamlet tells Hamlet what has already happened. Hamlet states in one of his soliloquies, "The spirit that I have seen / may be the devil" (2.2.598-599). Macbeth also has his doubts: when the witches tell him that he will be named Thane of Cawdor, Macbeth himself had not yet known it, but many people had, so it is possible the witches could have known. In the same manner in both plays, the presence of the supernatural begins to lead to the final downfall of each of the characters. In Macbeth, the three witches cause him to think and do evil deeds. In Hamlet, if he had not seen the ghost of his father, he would not have known that Claudius had killed his father to claim the throne. In both instances the characters gave in to the nagging supernatural beliefs, and hence they lost their lives.

Other characters in these plays show parallels in their plots. Both plays have a main character who is the king of that country. In Hamlet, the King of Denmark, Claudius, is directly related to Hamlet: he is his uncle, and also his mother's new husband. However, in Macbeth the King of Scotland, King Duncan, is not directly related to the main character.