Saturday, October 5, 2019
Analyze the BP Oil Spill Case - Essay Example
Transparency

From a transparency point of view, the response of BP was insufficient because it did not accord the relevant authorities the information that would help them counter the problem effectively. The slowness of the BP Company in responding to the spill indicated inadequacies in transparency, which was among the causes that made the spill continue for three months without being effectively addressed. According to CEG, the BP Company hindered the response offered to the oil spill, which was likely to affect the management of further oil spills (1). According to the CEG, the response to the spill was affected by the restrictions imposed on media access to the site, the delay in the disclosure of information from the company regarding dispersants, and the overall lack of cooperation by the company and the government agencies responsible (1). Further, the BP Company and the respective government agencies were very slow in releasing information to the public regarding the extent of the damage caused, the effects of the spill, and the level of transparency offered in the case by the parties responsible. Lack of transparency was evident from the reports offered by the BP Company and the government regarding the volumes spilled on a daily basis. The company and the government reported that the spill was releasing 1,000 barrels a day, but the reports were discredited later after it was estimated by a specialist agency that the spill was releasing between 11,000 and 25,000 barrels each day (CEG 1). Lack of transparency was also evident from BP's delays in providing high-definition video footage, which would facilitate computer analysis. BP wrote, on the 21st of May 2010, most likely after realizing that its lack of transparency was affecting the deployment of corrective measures to address the oil spill. In that account, the company noted that it would offer transparency and openness about the disaster, and its cooperation with organizations to respond to the oil spill.

Rationality

From a rationality perspective, the BP oil spill exposed a lack of disaster mitigation preparedness and carelessness on the part of the BP Company and the agencies that were supposed to respond to the issue. These agencies include the EPA (Environmental Protection Agency) and the DHS (Department of Homeland Security). The oil spill exposed the carelessness of the company because, knowing well the impacts of leaving the spill open, they left it open for more than three months. From a rational point of view, it is clear that BP, as well as the government, did not engage the resources they required to mitigate the spill in the short time they could (Walsh 1). Among the reasons cited as causes for the explosion is that the BP Company had failed to administer effective risk management, including that they did not inspect the facility prior to the time of the explosion. Therefore, the nature and the extent of the disaster displayed the company's ineffective risk mitigation and non-preparedness to address disasters (Walsh 1).

Avoiding Extremes

From the point of view of avoiding extremes, the BP oil spill was evidently a demonstration of disregard for the extreme effects of disaster. This is evident from the fact that, after the explosion of April 20th, the company and
Friday, October 4, 2019
After the Census of 2000, how has reapportionment affected the State - Research Paper
Nevada is one of the fifty states of the United States, situated in the western part of the country. Being one of the US states, Nevada participates in apportionment in the United States, which involves a process of dividing the 435 seats in the House of Representatives among the fifty states. According to the 2000 census report, the State of Nevada had a total population of 2,002,032, of which 1,998,267 was resident population and 3,775 was the overseas United States population whose hometown was Nevada. The article further states that the main aim of apportionment is to evenly distribute the congressional seats among the fifty states. The census determines the number of representatives that a state has in the United States House of Representatives: states with large populations are allocated more representatives than less populated states.

How reapportionment has affected the State of Nevada after the 2000 census

Reapportionment is the process of allocating seats among the fifty states in the United States based on the previous census. This allocation of the seats is determined by a state's population. Reapportionment takes place two years after the last census; hence, after the 2000 census, reapportionment was done in 2003. This apportionment population includes adults, children, United States citizens, and immigrants. Importantly, Nevada received one additional representative; each of the fifty states is entitled to at least one representative, with any further seats depending on the population of the state. A census in the United States is conducted every ten years. ... When the census is done, the results are reported to the president by the end of that year; the results of the census are then used to allocate the congressional seats to all the states.

There are four different formulas that are used when apportioning seats to the states. One of the formulas is the method of greatest divisors; this method divides the total population by the number of seats assigned to each state, such that every state is given the exact number of seats that it deserves. The second formula is the method of major fractions, invented by Daniel Webster and used in the 1840s, which added a seat to a state that had a fraction of one half or above. Alexander proposed a third formula in the 1850s that ensured that members were allocated to each state depending on the state's population, while the remaining seats were allocated to the most populated states. In the 1930s, the formula of equal proportions arose, which takes the state's population and divides it by the geometric mean of a state's present number of seats and the next seat.

According to Rourke (1980, p. 7), reapportionment is viewed as a converter from a rural pattern to urban dominance. He also states that reapportionment is deemed to produce changes in states, even if the change is minimal. During reapportionment, every state is guaranteed at least one seat; however, apportionment affects the distribution of votes, such that states that lose any seat lose a particular number of electoral votes. After a census, the population growth results assist in reapportioning, which has an impact on the number of votes that a state accumulates in presidential elections.
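To make the equal-proportions (Huntington-Hill) method just described concrete, here is a minimal Python sketch of the seat-allocation loop; the state names, populations and seat count below are hypothetical illustrations, and a real run would use the official census counts and 435 seats.

    import heapq
    import math

    def equal_proportions(populations, total_seats):
        # Every state is guaranteed one seat before any priority ranking.
        seats = {state: 1 for state in populations}
        # Priority value: population divided by the geometric mean of the
        # current seat count and the next one, sqrt(n * (n + 1)).
        heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
        heapq.heapify(heap)
        for _ in range(total_seats - len(populations)):
            _, state = heapq.heappop(heap)
            seats[state] += 1
            n = seats[state]
            heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
        return seats

    # Hypothetical three-state example with ten seats to allocate.
    print(equal_proportions({"A": 6_000_000, "B": 2_000_000, "C": 1_000_000}, 10))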
According to Salam (2010), some of the states like Nevada may
Thursday, October 3, 2019
Periodic Properties Essay Example
The halogens F, Cl, Br and I (At has not been included because of its scarcity and nuclear instability) are very reactive non-metals that occur in the penultimate group of the periodic table; hence they all require just one electron to complete their valence shell. All of the elements exist as diatomic molecules (F2, Cl2, Br2, I2) in which the atoms are joined by single covalent bonds. Going down a group of the periodic table, successive elements have more energy levels filled with electrons, so the outer electrons are in higher energy levels and farther from the nucleus. Fluorine and chlorine are gases, bromine is a liquid, and iodine is a solid that forms a purple vapour on heating. The halogens are all quite electronegative elements. They require just one electron to complete their valence shell; hence they readily gain electrons to form the singly charged halide ions (F−, Cl−, Br−, I−). The ease with which they gain electrons decreases down the group, because the electron gained is further from the nucleus and hence less strongly attracted. This means that, in contrast to the alkali metals, the reactivity of the halogens decreases going down the group.

Method

1) Test the solubility of iodine:
1. A very small amount of iodine was put into water, cyclohexane and KI(aq) respectively
2. The color changes of the solutions and the solubility in each solvent were recorded

2) Test how iodine reacts with starch:
1. Three drops of I2-KI solution were put into a test tube
2. A few drops of starch solution were added after that
3. The color of the solution was recorded

3) Test the acid-base properties:
1. A few drops of chlorine water were put on a test surface and tested with universal indicator paper
2. This was repeated first using water and then using iodine solution instead of the chlorine water
3. The color changes were recorded

4) Displacements between halogen elements:
1. A 2 cm depth of each aqueous solution (sodium chloride, potassium bromide and potassium iodide) was put into 3 respective test tubes and labeled
2. An equal volume of chlorine water was added into each test tube and the results were recorded
3. A little hexane was added to form a separate upper layer of a non-polar solvent
4. The mixtures were shaken and the changes were recorded
5. Steps 1, 2, 3 and 4 were repeated first using water and then iodine solution instead of chlorine water

5) Tests for halide ions (Cl−, Br− and I− with silver ions):
1. About 1 cm depth of aqueous sodium chloride was put into a test tube
2. A little aqueous silver nitrate was added and the observations were recorded
3. The test tube was placed in a sunny place, left there for about 5 minutes and then observed again
4. Steps 1, 2 and 3 were repeated using aqueous potassium bromide and then aqueous potassium iodide instead of sodium chloride

Data Collection

1) The solubility of iodine in different solvents:
Water: colorless, insoluble
Cyclohexane: purple, soluble
Ethanol: yellow, soluble
KI(aq): yellow-brown, soluble

2) Test of iodine reacting with starch: the color of the solution is black.
3) Test of the acid-base properties:
Cl2: pH value 4
Br2: pH value 3
I2-KI: pH value 12

4) Displacements between halogen elements:
The color change of the solution after Cl2, Br2 and I2 were added into NaCl, KBr and KI respectively:
NaCl - Cl2: no change; Br2: no change; I2: brown
KBr - Cl2: pale yellow solution; Br2: no change; I2: brown
KI - Cl2: yellow; Br2: yellow; I2: brown

The color of the upper layers of the solutions after hexane was added:
NaCl - Cl2: no change; Br2: no change; I2: purple-red
KBr - Cl2: pale purple; Br2: no change; I2: purple-red
KI - Cl2: purple; Br2: pale purple; I2: purple-red

5) Tests for halide ions (Cl−, Br− and I− with silver ions):
NaCl: a white precipitate is produced, which darkens after it is placed in sunlight
KBr: a cream precipitate is produced
KI: a yellow precipitate is produced

Data Analysis

1) The solubility of iodine in different solvents: The solubility is larger in the non-polar solvent (cyclohexane) and smaller in the polar solvent (water); iodine also dissolves readily in ethanol and in KI(aq). The purple color of iodine in cyclohexane arises because in non-polar solvents iodine forms a violet solution.

2) Test of iodine reacting with starch: From general knowledge, the expected color of this reaction is blue, but the color observed was black-green. That was because some of the starch hydrolyzed in water and produced substances that made the color darker.

3) Test of the acid-base properties:
1. Cl2: the color of the universal indicator paper showed that Cl2 is a strong acid.
2. Br2: the color of the universal indicator paper showed that Br2 is an acid, but not a very strong one.
3. I2: the color of the universal indicator paper suggested that I2 is a strong base. Actually, I2 is acidic; the original red-brown color of the I2 solution made it hard to read the indicator color clearly.

4) Displacements between halogen elements: As mentioned in the background above, the rule for displacements between halogen elements is that more reactive halogens displace less reactive ones, for example Cl2 + 2KI → 2KCl + I2. That is the reason why Br2 cannot displace Cl−, and I2 cannot displace Br− or Cl−. When there was no reaction between two elements, the color we observed was the blend of the original colors of the less reactive element and the solution containing the more reactive element. If there is a reaction between two elements, the color we observe is the color of the displaced element. According to the information we got from the Internet, hexane is a kind of oil and is insoluble in water, which is why we could differentiate the two layers of each solution very clearly. The color of each solution's lower layer was the original color of the saline solution. There were two kinds of outcomes for the color of the upper layer. Solutions that did not contain iodine gave colorless upper layers, because hexane is colorless and does not react with Cl− or Br−. For solutions containing iodine, the displaced I2 dissolves in hexane and shows its own color; that is why we could observe purple in this experiment.

5) Test for halide ions: When halide ions meet silver ions, a precipitate commonly appears.
The white precipitate is AgCl: AgNO3 + NaCl → AgCl + NaNO3
The off-white precipitate is AgBr: AgNO3 + KBr → AgBr + KNO3
The pale yellow precipitate is AgI: AgNO3 + KI → AgI + KNO3
After 10 minutes under the sunshine, photodissociation happened to all of them, so the black precipitates on the bottoms of the three test tubes are the products of photodissociation.

Conclusion

1. Going down the group, the elements of this group have the same effective nuclear charge.
The atomic radius of these elements becomes bigger because of the increase in the number of energy levels. The attraction between the nucleus and the valence electrons gets weaker, so less energy is required to remove the first electron from one mole of gaseous atoms: the ionization energy decreases going down the group. The ability to attract electrons also becomes weaker, so the electronegativity decreases going down the group.

2. Organic solvents always contain the element carbon. Inorganic solvents do not contain the element carbon. The most common solvent, water, is an example of an inorganic solvent. There are many more organic solvents than inorganic solvents. Comparing organic and inorganic solvents, the solubility of iodine is higher in organic solvents.

3. The oxidizing power of the halogens decreases going down the group, as the size of the atoms increases and the attraction between the nucleus and the electrons becomes less. Going down the group, the elements therefore become less powerful oxidising agents. This means that a higher halogen will displace a lower halogen from its salts; a lower halogen cannot displace a higher halogen from its salts.

4. When starch reacts with iodine, the typical blue-black color will appear. That is a good way for us to identify starch and iodine.

5. After photodissociation, the color of some precipitates will change. AgCl will become black; that is the most obvious one. Other precipitates will darken.

Evaluation

1. We used solid iodine in the first experiment. If we add the solvent into the test tube first, the test tube will be wet, and the solid iodine we put in later will attach to the inside surface instead of falling into the liquid. For this reason we had to add the solid iodine first in experiment 1.

2. According to the first experiment, we found that the solubility of iodine in pure water is very low, but the solubility of iodine in potassium iodide solution is relatively much higher. So we used I2-KI solution to increase the amount of iodine in order to make the observations more obvious.

REFERENCE

1) "Chemistry" (for use with the International Baccalaureate Diploma Programme), 3rd Edition, John Green and Sadru Damji, first published in 2007 by IBID Press, Victoria, pages 77-78.
2) http://www.epa.gov/ttn/atw/hlthef/hexane.html
3) http://baike.baidu.com/view/373611.htm
4) http://baike.baidu.com/view/908645.htm
The Reinsurance Expected Loss Cost Formula
The reinsurance expected loss cost can be written as RELC = ELCF x PCP x PCPLR x RCF, where:

ELCF is the excess loss cost factor (as a percentage of total loss cost)
PCP is the primary company (subject) premium
PCPLR is the primary company permissible loss ratio (including any loss adjustment expenses covered as a part of loss)
RCF is the rate correction factor, which is the reinsurer's adjustment for the estimated adequacy or inadequacy of the primary rate

Given that the coverage of this treaty is per-occurrence, we must also weight the manual rate for the clash exposure. In order to determine the reinsurer's excess share, the ALAE is added to each claim, and therefore claims from policies with limits below the attachment point will be introduced into the excess layer. The reinsurer may have its own data that describe the bivariate distribution of indemnity and ALAE, or such information can be obtained from ISO or a similar organization outside of the United States of America. With these data the reinsurer is able to construct increased limits tables with ALAE added to the loss instead of residing in its entirety in the basic limits coverage. Another, simpler alternative is to adjust the manual increased limits factors so that they account for the addition of the ALAE to the loss. A basic way of doing this is to assume that the ALAE for each and every claim is a deterministic function of the indemnity amount for the claim, which means adding exactly γ% to each claim value for the range of claim sizes that are near the layer of interest. This γ factor is smaller than the overall ratio of ALAE to ground-up indemnity loss, as much of the total ALAE relates to small claims or claims closed with no indemnity.

Assumption: when ALAE is added to loss, every claim with indemnity greater than $500,000 = $600,000/(1+γ) enters the layer $1,400,000 excess of $600,000, and the loss amount in the layer reaches $1,400,000 when the ground-up indemnity reaches $1,666,667 = $2,000,000/(1+γ). From this, the standard increased limits factors can be modified to account for ALAE added to the loss. In this liability context, the formula for RELC can be used with PCP as the basic limits premium and PCPLR as the primary company permissible basic limits loss ratio.

Assumption: given the clash exposure, an overall loss loading of δ% is sufficient to adjust the loss cost for this layer predicted from the stand-alone policies. Then ELCF determines the excess loss in the layer $1,400,000 excess of $600,000 which arises from each policy limit, plus its contribution to the clash losses, as a percentage of the basic limits loss arising from the same policy limit. The formula for ELCF evaluated at a policy limit Lim is as follows:

Formula: Liability ELCF for ALAE Added to Indemnity Loss

ELCF(Lim) = 0 for Lim x (1+γ) ≤ AP
ELCF(Lim) = (1+γ) x (1+δ) x [ILF(Lim) - ILF(AP/(1+γ))] otherwise

where:
Attachment Point AP = $600,000
Reinsurance Limit RLim = $1,400,000
Clash loading δ = 5%
Excess ALAE loading γ = 20%
ILF(·) is the manual increased limits factor without risk load and without ALAE

Table 2 displays this method for a part of Allstate's exposure, using hypothetical increased limits factors to calculate the excess loss cost factors with both ALAE and risk load excluded.
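To make the calculation concrete, the following is a minimal Python sketch of the ELCF formula as reconstructed above, using the hypothetical increased limits factors from Table 2 below; the ILF dictionary and the function name elcf are illustrative, not part of the source.

    AP, RLIM = 600_000, 1_400_000   # attachment point and reinsurance limit
    GAMMA, DELTA = 0.20, 0.05       # excess ALAE add-on and clash loading

    # Hypothetical manual ILFs w/o risk load and w/o ALAE (Table 2 below).
    ILF = {200_000: 1.0000, 500_000: 1.2486, 600_000: 1.2942,
           1_000_000: 1.4094, 1_666_666: 1.5273, 2_000_000: 1.5687}

    def elcf(lim: int) -> float:
        # Excess loss cost factor for a policy limit, per the formula above.
        if lim * (1 + GAMMA) <= AP:        # the policy cannot reach the layer
            return 0.0
        base = ILF[500_000]                # ILF at AP/(1+gamma) = $500,000
        return (1 + GAMMA) * (1 + DELTA) * (ILF[lim] - base)

    print(round(elcf(600_000), 4))    # 0.0575, as computed in the text
    print(round(elcf(2_000_000), 4))  # 0.4033, as computed in the text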
Table 2: Excess Loss Cost Factors with ALAE Added to Indemnity Loss at a 20% Add-on and a Clash Loading of 5%

(1) Policy Limit in $ | (2) ILF w/o risk load and w/o ALAE | (3) ELCF
200,000 | 1.0000 | 0
500,000 | 1.2486 | 0
600,000 | 1.2942 | 0.0575
1,000,000 | 1.4094 | 0.2026
1,666,666 | 1.5273 | 0.3512
2,000,000 or more | 1.5687 | 0.4033

Source: own calculation based on Patrik (2001)

Using the formula above, ELCF($600,000) = 1.20 x 1.05 x (1.2942 - 1.2486) = 0.0575, and ELCF($2,000,000) = 1.20 x 1.05 x (1.5687 - 1.2486) = 0.4033.

Assumption 1: for this exposure, Allstate's permissible basic limits loss ratio is PCPLR = 70%.
Assumption 2: the reinsurer's evaluation indicates that the cedant's rates and offsets are sufficient, and therefore RCF = 1.00.

The reinsurer can now calculate the exposure rate RELC and the reinsurer's undiscounted estimate of the loss cost in the excess layer, as can be seen in Table 3.

Table 3: Reinsurance Expected Loss Cost (undiscounted)

(1) Policy Limit in $ | (2) Estimated Subject Premium Year 2009 in $ | (3) Manual ILF | (4) Estimated Basic Limit Loss Cost 0.70x(2)/(3) | (5) ELCF | (6) RELC in $ (4)x(5)
Below 600,000 | 2,000,000 | 1.10 (avg.) | 1,272,727.27 | 0 | 0
600,000 | 2,000,000 | 1.35 | 1,037,037.04 | 0.0575 | 59,629.63
1,000,000 | 2,000,000 | 1.50 | 933,333.33 | 0.2026 | 189,093.33
2,000,000 or more | 4,000,000 | 1.75 (avg.) | 1,600,000.00 | 0.3512 | 562,920.00
Total | 10,000,000 | n.a. | 4,843,197.64 | n.a. | 811,642.96

Source: own calculation based on Patrik (2001)

An exposure loss cost can also be estimated using probability models of the claim size distributions. This directly gives the reinsurer the claim count and claim severity information, which the reinsurer can use in the simple risk theoretic model for the aggregate loss.

Assumption: the indemnity loss distribution underlying Table 2 is Pareto with q = 1.1 and b = 5,000. Then the simple model of adding 20% ALAE to the indemnity per occurrence changes the indemnity distribution to a new Pareto with q = 1.1 and b = 5,000 x 1.20 = 6,000. The reinsurer has to adjust the layer severity for clash, which can be done by multiplying by 1+δ = 1.05. The reinsurer can therefore calculate the excess expected claim sizes for each policy limit; after dividing the RELC for each limit by the expected claim size, the reinsurer obtains the estimates of the expected claim count. This is done in Table 4. The expected claim size is calculated as follows: first, the expected excess claim severity over the attachment point AP, subject to the reinsurance limit RLim, has to be calculated for each policy limit λ (the formulas for λ = 600,000, λ = 1,000,000 and λ = 2,000,000 are not reproduced in this copy; the resulting values appear in Table 4).

The reinsurer is now able to calculate the expected claim count; the estimation can be seen in Table 4.

Table 4: Excess Expected Loss, Claim Severity and Claim Count

(1) Policy Limit in $ | (2) RELC in $ | (3) Expected Claim Size in $ | (4) Expected Claim Count (2)/(3)
600,000 | 59,629.63 | 113,928 | 0.523
1,000,000 | 189,093.33 | 423,164 | 0.447
2,000,000 or more | 562,920.00 | 819,557 | 0.687
Total | 811,642.96 | 1,356,649 | 1.68

Source: own calculation based on Patrik (2001)

The total excess expected claim size for this exposure is $1,356,649.
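The omitted severity formulas can be illustrated numerically. Below is a minimal Python sketch of the limited-expected-value calculation implied by the Pareto assumption above, assuming the Pareto survival function S(y) = (b/(b+y))^q for the indemnity-plus-ALAE distribution. It approximately reproduces the Table 4 severities for the $600,000 and $1,000,000 limits; the "2,000,000 or more" row of Table 4 averages over limits above $2,000,000 and so will not match exactly. The function names are illustrative.

    Q, B = 1.1, 6_000                 # Pareto q and b after the 20% ALAE add-on
    AP, RLIM, DELTA, GAMMA = 600_000, 1_400_000, 0.05, 0.20

    def survival(y):
        # Pareto survival function S(y) = (b / (b + y))^q.
        return (B / (B + y)) ** Q

    def integral_of_survival(lo, hi):
        # Closed form of the integral of S(y) dy from lo to hi for the Pareto.
        return B ** Q / (Q - 1) * ((lo + B) ** (1 - Q) - (hi + B) ** (1 - Q))

    def expected_excess_severity(policy_limit):
        # Severity in the layer given a claim exceeds AP, loaded for clash.
        top = min(policy_limit * (1 + GAMMA), AP + RLIM)
        return (1 + DELTA) * integral_of_survival(AP, top) / survival(AP)

    for lam in (600_000, 1_000_000, 2_000_000):
        print(lam, round(expected_excess_severity(lam)))  # compare column (3) of Table 4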
If independence of claim events across all of the exposures can be assumed, the reinsurer can also obtain total estimates of the overall excess expected occurrence (claim) size and the expected occurrence (claim) count. Now we proceed to estimate the experience rating.

Step 3: Gather and reconcile primary claims data segregated by major rating class groups.

As in the example of the property quota share treaties, the reinsurer needs the claims data separated in the same way as the exposure data, and the reinsurer also wants some history of the individual large claims. The reinsurer usually receives information on all claims greater than one-half of the proposed attachment point, but it is important to receive as much data as possible.

Assumption: a claims review has been performed, and the reinsurer received a detailed history for each known claim larger than $100,000 occurring in 2000-2010, evaluated at 12/31/00, 12/31/01, ..., 12/31/09, and 6/30/10.

Step 4: Filter the major catastrophic claims out of the claims data.

The reinsurer wants to identify clash claims and the mass tort claims which are significant. By separating out the clash claims, the reinsurer can estimate their size and frequency and how they relate to the non-clash claims. These statistics should be compared to the values that the reinsurer knows from other cedants, so as to get a better approximation for the δ loading.

Step 5: Trend the claims data to the rating period.

As with the example for the property quota share treaties, the trending should account for inflation and also for other changes in the exposure (e.g. higher policy limits) which may affect the loss potential; unlike with proportional coverage, this step cannot be skipped. The reason for this is the leveraged effect which inflation has upon excess claims. A constant inflation rate increases the aggregate loss above any attachment point faster than the aggregate loss below it, because claims grow into the excess layer while their value below is capped at the attachment point. Each ground-up claim value, including ALAE, is trended at each evaluation from the year of occurrence to 2011. For example, consider the treatment of a 2003 claim in Table 5.

Table 5: Trending an Accident Year 2003 Claim

(1) Evaluation Date | (2) Value at Evaluation in $ | (3) Trend Factor | (4) 2011-Level Value in $ | (5) Excess Amount in $
12/31/03 | 0 | 1.62 | 0 | 0
12/31/04 | 0 | 1.62 | 0 | 0
12/31/05 | 250,000 | 1.62 | 405,000 | 0
12/31/06 | 250,000 | 1.62 | 405,000 | 0
12/31/07 | 300,000 | 1.62 | 486,000 | 0
12/31/08 | 400,000 | 1.62 | 648,000 | 48,000
12/31/09 | 400,000 | 1.62 | 648,000 | 48,000
06/30/10 | 400,000 | 1.62 | 648,000 | 48,000

Source: own calculation based on Patrik (2001)

A single trend factor is used in this example because trend affects the claim values according to the accident date, not the evaluation date. The trending of the policy limits is a delicate issue: if a 2003 claim on a policy whose limit is less than $500,000 inflates to above $600,000 (plus ALAE), will the policy limit sold in the year 2011 be greater than $500,000? It seems that over long periods of time the policy limits change with inflation. Therefore the reinsurer should, if possible, receive information over time on Allstate's policy limit distributions.
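A minimal sketch of the trend-and-layer step in Table 5, assuming the single 1.62 trend factor and the $1,400,000 xs $600,000 layer; the function name is illustrative:

    AP, RLIM, TREND = 600_000, 1_400_000, 1.62

    def excess_amount(value_at_evaluation: float) -> float:
        # Trend the claim to the 2011 level, then take the part in the layer.
        trended = value_at_evaluation * TREND
        return min(max(trended - AP, 0.0), RLIM)

    print(excess_amount(400_000))   # 48,000.0, as in the last rows of Table 5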
Step 6: Develop the claims data to settlement values.

The next step is to construct the historical accident year by development year triangles for each type of large claim from the data produced in column (5) of Table 5. Typically all claims should be combined together by major line of business. Afterwards, the loss development factors should be estimated and applied to the excess claims data using the standard methods. In order to check for reasonableness against comparable coverages, we also want to compare the development patterns estimated from Allstate's data to our own expectations, based on our own historical data. Considering the claim in Table 5, we see that only $48,000 is over the attachment point, and only at the fifth development point.

Assumption: our triangle looks like Table 6.

Table 6: Trended Historical Claims in the Layer $1,400,000 Excess of $600,000 (in $1,000s)

Acc. Year | Age 1 | Age 2 | Age 3 | ... | Age 9 | Age 10 | Age 10.5
2000 | 0 | 90 | 264 | ... | 259 | 351 | 351
2001 | 0 | 0 | 154 | ... | 763 | 798 | ...
... | ... | ... | ... | ... | ... | ... | ...
2008 | 77 | 117 | 256 | | | |
2009 | 0 | 0 | | | | |
2010 | 0 | | | | | |
ATA | 4.336 | 1.573 | 1.166 | ... | 1.349 | n.a. | n.a.
ATU | 15.036 | 3.547 | 2.345 | ... | 1.401 | 1.050 = tail |
Smoothed Lags | 11.9% | 28.7% | 47.7% | ... | 93.1% | 95.3% | 96.7%

Source: own calculation based on Patrik (2001)

Where:
ATA is the age-to-age development factor
ATU is the age-to-ultimate development factor
Lag(t) is the percentage of loss reported at time t

The selection of the tail factor of 1.05 is based upon general information about the development for this type of exposure beyond ten years. By inverting the age-to-ultimate factors to obtain the time lags of claim dollar reporting, the loss reporting view is transformed into that of a cumulative distribution function (CDF) whose domain is [0, ∞); this transformation gives a better outlook on the loss development pattern. It also allows considering and measuring the average (expected) lag and other moments that are comparable to the moments of loss development patterns from other exposures. Given the chaotic development of excess claims, it is important to employ a smoothing technique. If the smoothed factors are correctly estimated, they should yield more credible loss development estimates. They also allow evaluating the function Lag(t) at every positive time. The smoothing introduced in the last row of Table 6 is based on a Gamma distribution with a mean of 4 (years) and a standard deviation of 3. It is also usually useful to analyze the large-claim paid data, if possible, both to estimate the patterns of excess claims payment and to supplement the ultimate estimates based only on the reported claims used above. Sometimes the only data available are data on aggregate excess claims, which here would be the historical accident year per development year $1,400,000 excess of $600,000 aggregate loss triangle. Pricing without specific information about the large claims in such a situation is very risky, but it is occasionally done.
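A minimal sketch of the age-to-ultimate and smoothed-lag calculations described above, assuming the 1.05 tail factor and the stated Gamma parameters (a mean of 4 and a standard deviation of 3 imply shape (4/3)^2 and scale 9/4). The ATA list below holds only the three factors visible in Table 6, so the printed ATUs will not reproduce the table's full-triangle values.

    from scipy.stats import gamma

    ATA = [4.336, 1.573, 1.166]   # first three age-to-age factors from Table 6
    TAIL = 1.05

    def age_to_ultimate(ata, tail):
        # ATU at each age = product of all later ATAs times the tail factor.
        atu, acc = [], tail
        for f in reversed(ata):
            acc *= f
            atu.append(acc)
        return list(reversed(atu))

    def smoothed_lag(t, mean=4.0, sd=3.0):
        # Reporting lag as a Gamma CDF; shape and scale derived from mean and sd.
        shape = (mean / sd) ** 2
        scale = sd ** 2 / mean
        return gamma.cdf(t, a=shape, scale=scale)

    print(age_to_ultimate(ATA, TAIL))                      # ATUs from the partial ATA list
    print([round(smoothed_lag(t), 3) for t in (1, 2, 3)])  # compare the smoothed-lags row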
Step 7: Estimate the catastrophic loss potential.

Mass tort claims, such as pollution clean-up claims, distort the historical data and therefore need special treatment. As with the property coverage, the analysis of Allstate's exposures may allow us to predict a suitable loading for the future mass tort claim potential. As was said in Step 4, the reinsurer needs to identify the clash claims. With the clash claims separated, the various parts of each claim are added together so that the attachment point and the reinsurance limit can be applied to the occurrence loss amount. If it is not possible to identify the clash claims, then the experience estimate of RELC has to include a clash loading based on judgment of the general type of exposure.

Step 8: Adjust the historical exposures to the rating period.

As in the example on the property quota share treaties, the historical exposure (premium) data has to be adjusted in a manner that makes the data reasonably relevant to the rating period; therefore, adjustments should be made for primary rate changes, underwriting changes and other changes in exposure that may affect the loss potential of the treaty.

Step 9: Estimate an experience expected loss cost, PVRELC, and, if desirable, a loss cost rate, PVRELC/PCP.

Assumption: we have trended and developed excess losses for all classes of Allstate's casualty exposure. The standard practice is to add the pieces up, as seen in Table 7.

Table 7: Allstate Insurance Company Casualty Business

(1) Accident Year | (2) Onlevel PCP in $ | (3) Trended and Developed Excess Loss (estimated RELC) in $ | (4) Estimated Loss Cost Rate in % (3)/(2)
2002 | 171,694 | 6,714 | 3.91
2003 | 175,906 | 9,288 | 5.28
2004 | 178,152 | 13,522 | 7.59
2005 | 185,894 | 10,820 | 5.82
2006 | 188,344 | 9,134 | 4.85
2007 | 191,348 | 6,658 | 3.48
2008 | 197,122 | 8,536 | 4.33
2009 | 198,452 | 12,840 | 6.47
2010 | 99,500 | 2,826 | 2.84
Total | 1,586,412 | 80,336 | 5.06
Total w/o 2010 | 1,486,912 | 77,510 | 5.21

Source: own calculation based on Patrik (2001)

The average loss cost rate for the eight years is 5.21%, where the data from the year 2010 is eliminated as it is too green (undeveloped), and there does not seem to be a particular trend from year to year. Table 7 gives us the experience-based estimate RELC/PCP = 5.21%, but this estimate has to be loaded for the existing mass tort exposure, and also for the clash claims if we had insufficient information on the clash claims in the claims data.

Step 10: Estimate a credibility loss cost or loss cost rate from the exposure and experience loss costs or loss cost rates.

The experience loss cost rate has to be weighed against the exposure loss cost rate that we already calculated. If there are various differing answers that cannot be reconciled, the final answers for the $1,400,000 excess of $600,000 claim count and severity may be based on credibility weighting of these separate estimates. The differences should not be ignored, however, but should be included in the estimates of the parameter (and model) uncertainty, thereby giving rise to more realistic measures of the variances, etc., and of the risk.

Assumption: a simple situation, where only the experience loss cost estimate and the exposure loss cost estimate are weighed together.
The six considerations for deciding how much weight should be given to the exposure loss cost estimate are:

The accuracy of the estimate of RCF, the primary rate correction factor, and thus the accuracy of the primary expected loss cost or loss ratio
The accuracy of the predicted distribution of subject premium by line of business
For excess coverage, the accuracy of the predicted distribution of subject premium within a line of business by increased limits table for liability, by state for workers compensation, or by type of insured for property
For excess coverage, the accuracy of the predicted distribution of subject premium by policy limit within increased limits table for liability, by hazard group for workers compensation, or by amount insured for property
For excess coverage, the accuracy of the excess loss cost factors for coverage above the attachment point
For excess coverage, the degree of potential exposure not contemplated by the excess loss cost factors

The credibility of the exposure loss cost estimate decreases if there are problems with any of these six items.

Likewise, the six considerations for deciding how much weight can be given to the experience loss cost estimate are:

The accuracy of the estimates of claims cost inflation
The accuracy of the estimates of loss development
The accuracy of the subject premium on-level factors
The stability of the loss cost, or loss cost rate, over time
The possibility of changes in the underlying exposure over time
For excess coverage, the possibility of changes in the distribution of policy limits over time

The credibility of the experience loss cost estimate likewise lessens with problems in any of these six items.

Assumption: the credibility loss cost rate is RELC/PCP = 5.75%.

For each exposure category, a loss discount factor is estimated based on the expected loss payment pattern for the exposure in the layer $1,400,000 excess of $600,000 and on a chosen investment yield. Most actuaries support the use of a risk-free yield, such as U.S. Treasuries for U.S. business, approximating the maturity of the average claim payment lag. Discounting is significant only for longer-tailed business. On a practical basis, for a bond maturity between five and ten years it is better to use a single, constant fixed rate.

Assumption: the overall discount factor for the loss cost rate of 5.75% is RDF = 75%, which gives PVRELC/PCP = RDF x RELC/PCP = 0.75 x 5.75% = 4.31%, or PVRELC = 4.31% x $200,000,000 = $8,620,000.

Steps 11 and 12 are reversed in this example.

Step 12: Specify values for RCR, RIXL, and RTER.

Assumption: the standard guidelines for this size and type of contract and this type of exposure specify RIXL = 5% and RTER = 15%. The reinsurance pure premium RPP can be calculated as RPP = PVRELC/(1-RTER) = $8,620,000/0.85 = $10,141,176, with an expected profit of RPP - PVRELC = $10,141,176 - $8,620,000 = $1,521,176 for the risk transfer. As RCR = 0%, we can calculate the technical reinsurance premium RP = RPP/(1-RIXL) = $10,141,176/0.95 = $10,674,922. This technical premium is above the maximum of $10,000,000 which was specified by the Allstate Insurance Company.
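The premium arithmetic above can be condensed into a short sketch (values as stated in the text; note the text rounds 0.75 x 5.75% to 4.31% before applying the subject premium):

    PCP  = 200_000_000   # subject premium
    RDF  = 0.75          # loss discount factor
    RATE = 0.0575        # credibility loss cost rate RELC/PCP
    RTER = 0.15          # target economic return (risk loading)
    RIXL = 0.05          # internal expense loading

    pv_rate = 0.0431                 # RDF * RATE = 0.043125, rounded as in the text
    pvrelc  = pv_rate * PCP          # $8,620,000
    rpp     = pvrelc / (1 - RTER)    # about $10,141,176, the reinsurance pure premium
    rp      = rpp / (1 - RIXL)       # about $10,674,922, the technical premium
    print(round(pvrelc), round(rpp), round(rp))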
If there is nothing wrong with the technical calculations, then the reinsurer has two options. The first is to accept the expected reinsurance premium of $10,000,000 at a rate of 5%, with the expected profit reduced to $10,000,000 - $8,620,000 = $1,380,000. The second is to propose a variable-rate contract, with the reinsurance rate varying according to the reinsurance loss experience, which in this case is a retrospectively rated contract. As the Allstate Insurance Company is asking for a retrospectively rated contract, we select the second possibility. To construct a fair and balanced rating plan, the distribution of the reinsurer's aggregate loss has to be estimated. Now we proceed with Step 11.

Step 11: Estimate the probability distribution of the aggregate reinsurance loss, if desirable, and perhaps other distributions, such as for claims payment timing.

In this step the Gamma distribution approximation will be used. As our example is a low (excess) claim frequency situation, the standard risk theoretic model for aggregate losses will be used together with the first two moments of the claim count and claim severity distributions to approximate the distribution of aggregate reinsurance loss. The aggregate loss in the standard model is written as the sum of the individual claims, as follows.

Formula: Aggregate Loss

L = X1 + X2 + ... + XN

with L as a random variable (rv) for the aggregate loss, N as an rv for the number of claims (events, occurrences), and Xi as an rv for the dollar size of the ith claim.

Here N refers to the excess number of claims and the Xi to the amounts of the individual excess claims. To see how the standard risk theoretic model relates to the distributions of L, N and the Xi, see Patrik (2001). We work with the assumption that the Xi are identically and independently distributed and also independent of N; it then follows that the kth moment of L is determined completely by the first k moments of N and the Xi. The following relationships hold.

Formula: First Two Central Moments of the Distribution of Aggregate Loss under the Standard Risk Theoretic Model

E[L] = E[N] x E[X]
Var[L] = E[N] x E[X^2] + (Var[N] - E[N]) x E[X]^2

Assumption: E[L] = RELC = 5.75% x $200,000,000 = $11,500,000 (undiscounted). We assume, simplistically, that the excess claim sizes are independent and identically distributed and independent of the excess claim (occurrence) count. Usually this is a reasonable assumption. For our layer $1,400,000 excess of $600,000, our modeling assumptions and results are shown below.

Formula: Allstate $1,400,000 Excess of $600,000 Aggregate Loss Modeling Assumptions and Results
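The assumptions and results themselves are cut off in this copy of the text. As a placeholder, here is a minimal sketch of the two-moment calculation just stated; the claim count and severity inputs are hypothetical illustrations (chosen so that E[L] matches the stated $11,500,000), not figures from the source.

    def aggregate_moments(e_n, var_n, e_x, e_x2):
        # First two central moments of L under the standard risk theoretic model.
        mean = e_n * e_x
        var = e_n * e_x2 + (var_n - e_n) * e_x ** 2
        return mean, var

    # Hypothetical inputs: a mean severity of $500,000 implies an expected
    # count of 23, so that E[L] = 23 * 500,000 = $11,500,000 as stated above.
    e_x = 500_000.0
    e_n = 23.0
    var_n = 1.5 * e_n            # hypothetical contagion: Var[N] > E[N]
    e_x2 = 2.0 * e_x ** 2        # hypothetical second moment of severity
    mean, var = aggregate_moments(e_n, var_n, e_x, e_x2)
    print(f"E[L] = {mean:,.0f}, Var[L] = {var:,.0f}")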
Wednesday, October 2, 2019
Government is Best which Governs Least
I heartily accept the motto, "That government is best which governs least"; and I should like to see it acted up to more rapidly and systematically. Carried out, it finally amounts to this, which also I believe--"That government is best which governs not at all"; and when men are prepared for it, that will be the kind of government which they will have. Government is at best but an expedient; but most governments are usually, and all governments are sometimes, inexpedient. The objections which have been brought against a standing army, and they are many and weighty, and deserve to prevail, may also at last be brought against a standing government. The standing army is only an arm of the standing government. The government itself, which is only the mode which the people have chosen to execute their will, is equally liable to be abused and perverted before the people can act through it. Witness the present Mexican war, the work of comparatively a few individuals using the standing government as their tool; for, in the outset, the people would not have consented to this measure. This American government--what is it but a tradition, though a recent one, endeavoring to transmit itself unimpaired to posterity, but each instant losing some of its integrity? It has not the vitality and force of a single living man; for a single man can bend it to his will. It is a sort of wooden gun to the people themselves. But it is not the less necessary for this; for the people must have some complicated machinery or other, and hear its din, to satisfy that idea of government which they have. Governments show thus how successfully men can be imposed upon, even impose on themselves, for their own advantage. It is excellent, we must all allow. Yet this government never of itself furthered any enterprise, but by the alacrity with which it got out of its way. It does not keep the country free. It does not settle the West. It does not educate. The character inherent in the American people has done all that has been accomplished; and it would have done somewhat more, if the government had not sometimes got in its way. For government is an expedient by which men would fain succeed in letting one another alone; and, as has been said, when it is most expedient, the governed are most let alone by it.
Tuesday, October 1, 2019
Physics of Incandescent Light Bulbs
The incandescent light bulb, since its fairly recent invention, has quickly become a basic essential of modern technological life as we know it. It took many years to create a practical bulb despite the simplicity of its structure. I believe a majority of us take them completely for granted as a normal part of life. Early man knew the sun as his light source, and when the sun set, he knew the moon and the stars. As his intelligence increased and he learned about the world in which he lived, he became acquainted with fire. Fire could be used for warmth, cooking, protection, and light. Man lived with this for years, elaborating and improving the way the fire was created and burned for light, until the year 1809, when one man, an English chemist by the name of Humphry Davy, began the search for a usable incandescent light source using electricity. Using a high-powered battery to induce a current between two charcoal strips, he produced an intense incandescent light, which became known as the first arc lamp. Although it was a first step, it was not yet a practical light source. The first known attempt to make an actual bulb came in 1840, when Warren De la Rue enclosed a coil of platinum wire in an evacuated tube and passed an electrical current through it. Although a platinum light bulb was not practical, the idea behind his design was: a metal with a high melting point to achieve high temperature, and thus bright light, as well as an evacuated tube containing fewer particles to react with the metal, and thus a longer bulb life. Throughout the next few decades scientists labored to create their "efficient" light bulb. Their main hurdle was finding a low-cost, long-lived, high-temperature filament material that would glow with high intensity. In 1879 two scientists, Joseph Wilson Swan and Thomas A. Edison, had independent breakthroughs toward a longer-lasting incandescent bulb with their use of a carbon fiber filament derived from cotton. It lasted a maximum of 13.5 hours. In 1880 Edison also developed a filament derived from bamboo which lasted up to 1200 hours. This was good, but to create a truly efficient bulb something different was needed to create a filament with very high temperatures but without degeneration and loss of heat. Many elements were experimented with, a few of the most popular of which were carbon, osmium, and tantalum.
Implementation Plan Research Essay
Founded in 1899, Harrison Keyes has been a leader in publishing business, scientific and technical information. Due to recent changes in the industry, and in an attempt to revamp its former successes, the company is in the process of redefining itself in the market. One major change is transforming the sales market from a printed version of books to an e-book platform (University of Phoenix, 2007). The primary focus is on developing a full-service site. This paper will identify companies that have faced specific issues related to those identified in the Harrison Keyes scenario (University of Phoenix, 2007) and related to the course concepts. For each company, the paper will discuss the issue identified in the scenario that is also facing the company, how the company responded to the issue, and the outcomes of the company's response. Additionally, the paper will provide an analysis that synthesizes the key findings. The analysis will identify the key course concepts and compare and contrast the practices of each company related to those concepts.

Research Summaries

NASA

NASA, being a government agency that utilizes the expertise of private firms for many of its projects, has developed a very articulate request for proposal (RFP). Requirements and features must be in enough detail that contractors have a clear description of the final deliverable that will meet the customer's needs. In most cases the RFP also specifies an expected format for the contractor's bid proposal so the responses of different contractors can be fairly evaluated (Gray & Larson, 2005, p. 52). This is to avoid ambiguity and provide an even ground to start from. When comparing the utilization of an RFP by NASA and Friar Tuck (FT), one may note that if FT had implemented an RFP, many of the issues the organization is facing could have been alleviated. With FT, an RFP delivered to all vendors involved would have provided the starting blocks of the project. Each company would have an opportunity to bid on a project that best suits its needs and expertise. This way, when the selected project is under way, the involved parties have an active, vested interest in the success and outcome of the project.

HKIA

Hong Kong International Airport was built with the expectation that there would be a large volume of travelers and goods going in and out on a daily basis. As demands increased, the airport authorities had the competence to create a work breakdown structure (WBS). The early stages of developing the outline serve to ensure that all tasks are identified and that participants of the project have an understanding of what is to be done. Once the outline and its detail are defined, an integrated information system can be developed to schedule work and allocate budgets. This baseline information is later used for control (Gray & Larson, 2005, p. 99). If Friar Tuck had implemented a WBS, there would have been a clearer understanding of the deliverables, the costs associated with each deliverable, when each deliverable was due, and who was responsible for ensuring that his or her assigned work was completed within the allotted timeframe. Because budget and tasks appeared to be important, FT found itself with the goal but lacking the details of how to execute.
Because HKIA realized that its greatest assets are the passengers and cargo, the organization took great lengths to ensure that daily business transactions were not disrupted and that the airport was able to meet the demands placed upon it by its customers, including passengers and cargo.

Project planning at Harrison-Keyes has progressed to a WBS, which is being worked within the company's existing functional organizational structure. A status check on Harrison-Keyes's approach to project management demonstrates difficulties with task completion and employee behaviors. Organizations that choose to manage projects within their current functional structure face uphill battles between the functional silos. Project management within existing functional organizational structures has known advantages and disadvantages. The advantages are no structural changes, flexibility, in-depth expertise and easy post-project transition. The disadvantages of managing projects in this manner are lack of focus, poor integration, slowness, and lack of ownership (Gray & Larson, 2006, p. 58). Harrison-Keyes has alternatives to consider given the pros and cons of managing projects within its existing functional structure, and the United States Department of Defense provides a benchmark for consideration. Conflict with the authors at H-K has to be addressed for a successful transition to an e-publishing company. As a benchmark examination for H-K, the DOD is commonly faced with large and complex project implementations which are riddled with differences of opinion between the functional branches of the service: the Army, Navy, and Air Force. Three types of conflict were identified in the DOD case, along with effective measures to counteract the friction. The DOD recognized three types of conflict which hindered project implementation in a functional organization: 1) interpersonal-based conflict, 2) task-based conflict and 3) process-based conflict (Sutterfield et al., 2006). In response to identifying these three broad conflict classifications, the DOD case study created effective strategies to address them while managing projects. Interpersonal-based conflicts within projects at the DOD are addressed with a strategy of compromise and of building collaborative relationships to create win-win discussions between functional areas (Sutterfield et al., 2006). Task-based conflict resolution relies on an effective project manager navigating aspects of stakeholders' positions, power or influence. As a project manager evaluates these factors, a determination can be made to deploy a competing, collaborating, or compromising strategy to effectively manage the project (Sutterfield et al., 2006). Process-based conflict resolution is more clear-cut due to the sequential requirements of projects. A heavier-handed approach towards stakeholders is required, as less flexibility can be allowed in order to move the project forward. If flexibility is allowable within the project step, a more collaborative approach can be considered (Sutterfield et al., 2006).

A second case study, of Honeywell, Inc., offers H-K the possibility of determining whether a process breakdown structure (PBS) is better suited than the currently developed work breakdown structure (WBS). First let us look at 10 lessons learned by Honeywell in the use of PBS and contrast their potential application to Harrison-Keyes's situation. It is important to note that both Honeywell and H-K are faced with a dramatic shift in their business strategy.
Honeywell's experience yielded 10 lessons on attempting to make radical change in an organization. The 10 lessons learned were (Paper et al., 2001):

1. People are the key enablers of change
2. Question everything
3. People need a systemic methodology to map processes
4. Create team ownership and a culture of dissatisfaction
5. Management attitude and behavior can squash projects
6. Bottom-up or empowered implementation
7. Redesign must be business-driven and continuous
8. IT is a necessary, but not sufficient, enabler
9. Set stretch goals
10. Execution is the real difference between success and failure

Similar to Harrison-Keyes's competitive requirement to change from print publishing to e-publishing, Honeywell faced competitive pressure for a tenfold reduction in defects and a fivefold reduction in production cycle time. Honeywell dramatically changed its method of project implementation within the organization to accomplish successful results. Honeywell eliminated project management by tasks and details (WBS) in favor of managing by process orientation (PBS). Honeywell learned two key lessons in project implementation of radical change. The first lesson learned through Honeywell's project execution through PBS was that execution separates high performers from less successful PBS projects. The second lesson involved recognizing that identifying the difficulty of change is not sufficient; a critical step was to change the vision of the organization to reflect the radical change (Paper et al., 2001). In addition to these two lessons, Honeywell experienced the behavioral aspects of project management that H-K now faces. Honeywell found that successful execution is dependent upon behavioral change. Behavioral change was found to be extremely difficult and required time to be successful. The need for time is often in conflict with the need for quick profits (Paper et al., 2001). Middle management was found to be the most resistant to change because of their dedicated knowledge skill-set versus process skill-set. Strong training programs coupled with pay-for-performance plans to provide financial incentives helped Honeywell overcome this hurdle.

PayPal and YouTube are two of the biggest success stories since the dotcom bubble burst sometime after the year 2000. For all of the companies' success, there has been a well-publicized and scrutinized series of shortcomings and fallbacks that could have been avoided. Risk management practices would have prepared these companies for growth and expansion while building their brands, and could possibly have eliminated the issues they face today. PayPal burst onto the ecommerce scene in 1999. The public was just beginning to embrace the idea of shopping for everyday goods and services online but was wary about giving personal information to strangers or having to repeatedly enter credit card information online. PayPal was able to offer consumers a simple web interface and peace of mind by storing that sensitive data and allowing buyers and sellers to seamlessly complete online transactions. The thought that the website would explode and become the de facto payment solution on the Internet did not cross the designers' minds, and the risk was not properly assessed. PayPal has lost many customers and has had to fight many lawsuits due to poor planning. Even now PayPal is struggling to catch up while meeting the demand for its service. This same lack of a contingency plan could doom H-K as it searches for a firm to perform the digital formatting.
YouTube has become one of the most visited websites on the Internet in just a few short years. While the designer can be proud of creating a forum for all things video, he too can be blamed for not creating a risk management plan. Allowing users to upload their own videos has opened the door to copyrighted material being available on the website. The networks and studios have been slow to embrace the site as a way to increase brand awareness and have instead blamed YouTube for declining revenues and ratings. Entire movies have been made available on the website the same day they were released in theaters, providing some credence to the argument against the site. Had YouTube made alliances with the studios and networks prior to allowing users to upload videos, it could have avoided the problem altogether. This is another case of a business that did not properly assess and mitigate risk and now faces the threat of lawsuits.

General Electric and Siemens have successfully dealt with the issues of a corporate culture that negatively affected their ongoing project structure, as well as a culture that fostered individuality among the various departments. HK faces similar problems in that its culture is one of individualism and lack of accountability, which has led to a lack of consensus among its leadership as to project management structures, organizational culture, and creating and communicating that culture throughout the organization. GE has created a corporate culture that is not individualized by department like HK's, but instead involves all members of the GE community: its management, its employees, and its customers. GE has also created a corporate culture that dismisses politics as a means to an end. Through the changes to culture and GE's team approach, GE was able to organize projects as dedicated teams within functional organizations. Culture has a considerable effect on the success rate at HK as well; the organization's culture has affected its projects. HK has hired a new CEO, Meg McGill, to move the company in the strategic direction of "all things digital." However, nowhere in her emails and correspondence among HK's leadership did she address the change in culture needed to achieve her strategic business objectives. Meg needs to implement a change in corporate culture like that of the CEO of Siemens, Klaus Kleinfeld, who changed the corporate culture to one where "Everyone, including the boss, is accountable. We commit to something, and we deliver" (Ewing, 2007). To effect this, Kleinfeld has had to deploy hard tactics. By implementing a changed culture that emphasizes accountability, project management structures will become more thorough and thought out. Organizational culture and structure influence project management more than HK realizes. Shifting the culture of HK to one of accountability will go a long way toward formalizing project management structures, such as organizing projects through dedicated teams, within the functional organizations of HK, within a matrix, or within network organizations. GE organized its projects through dedicated teams, Siemens organized by deploying hard tactics within its functional organizations, and both implemented these project structures by changing their corporate culture. HK must take similar steps in order to realize its business objectives.
When the University of Phoenix decided to implement e-books in its online learning system, management knew it needed a corporate strategy and high-quality project management. Lacking these meant costly lawsuits, because a great many authors opposed the idea of e-books on the grounds of possible fraud and copyright violations. The company was not able to avoid lawsuits, but it was able to protect itself from future legal issues and establish grounds for strong digital content on its learning websites. This was made possible by a clearly defined corporate strategy and project management: In the lawsuit, filed in U.S. District Court in Atlanta, Patrick G. McKeown alleges that the Thomson Corporation and two of its subsidiaries sold a customized electronic version of his book, Information Technology and the Networked Economy, to the University of Phoenix, which in turn sold more than 23,000 copies to its students. (Mr. McKeown says an updated royalty statement he recently received shows that the actual number of copies sold by Phoenix is now about 45,000.) Thomson did not return telephone calls seeking comment. In a written statement, Phoenix said that it honors the intellectual-property rights of others, adding that "the University licensed its rights to use the textbook from a reputable, well-known publisher that represented it had the appropriate rights to the book" (Chronicle of Higher Education, 2007).

OnStar

OnStar represents an example of risk response development. OnStar provides wireless access to emergency and security services from General Motors vehicles. A great number of opponents claimed that OnStar collects personal information and might use it for marketing and other purposes. The company did not try to refute the claim, since the information is indeed collected: "You start [collecting] individual pieces of information that seem benign," he says. "But when you begin to combine bits of information it becomes less and less so." White says OnStar, in storing data only in aggregate, is walking a fine line. "It is disingenuous to talk about aggregate data when you have the ability to differentiate it," he says. There may not be a business case for creating individual profiles today, White adds, but there may be someday, and that is when potential privacy violations will become a concern (CIO, 2006). General Motors' response to the risk of having OnStar shot down over its data collection was to publish survey results indicating that, the above fact notwithstanding, a great number of drivers still choose OnStar: "It's clear from the survey responses that women are looking for ways to enhance their peace of mind when driving, whether it's a long-distance solo car trip, the daily commute to work, or simply ferrying kids to and from after-school activities," notes Chet Huber, OnStar President. "Ninety-four percent of female subscribers say that OnStar provides peace of mind when they're traveling alone; 87 percent of female subscribers say that OnStar provides that peace of mind when loved ones are traveling. And more than 70 percent of OnStar's female subscribers tell us they prefer or will only purchase an OnStar-equipped vehicle" (OnStar, 2006).
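OnStar's handling of the privacy issue is one instance of risk response development, in which each identified risk is assigned one of the four classic responses (avoid, transfer, mitigate, retain). The sketch below illustrates the idea; apart from the OnStar entry, which mirrors the case above, the risks and their assigned responses are illustrative assumptions.

```python
# Sketch of risk response development: map each identified risk to one of
# the four classic responses. Entries other than OnStar's are hypothetical.
RESPONSES = {"avoid", "transfer", "mitigate", "retain"}

risk_responses = {
    "Privacy concerns over collected vehicle data": "retain",      # OnStar: accept the risk, publicize benefits
    "Copyright exposure from licensed e-book content": "transfer", # license rights from the publisher
    "Service outage during an emergency call": "mitigate",         # redundancy and testing
}

for risk, response in risk_responses.items():
    assert response in RESPONSES  # guard against an unrecognized response type
    print(f"{response:>8}: {risk}")
```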
Researched Company Synopses

Since its inception in 1958, NASA has accomplished many great scientific and technological feats in air and space (NASA, 2007). With its continued research and development programs, the agency has provided a renewed interest in space, the planet, and the environment in general. _Identified Issue_ NASA, in its quest to improve manned space exploration, began the Ares project. The project was initially created to develop a rocket that would enable astronauts to travel to the moon and eventually to the planet Mars. The organization needed assistance from private firms in the creation of the avionics unit that crew members would use to control navigation, guidance, and other hardware (NASA, 2007). The organization needed to implement a process that would ensure that the winning contract went to the most qualified company. _Response to the Issue_ To ensure that the organization brought in a highly qualified private firm that knew exactly what NASA was looking for, the project team created a request for proposal. The issues covered included, but were not limited to:

1. Synopsis of requirements and request for action
2. Statement of work (SOW) detailing the scope and major deliverables
3. Deliverable specifications/requirements, features, and tasks
4. Responsibilities: vendor and customer
5. Project timetable
6. Costs and payment schedule
7. Type of contract
8. Experience and staffing
9. Evaluation criteria (Gray & Larson, 2005, p. 52)

_Outcome_ The organization, after interviewing and investigating several well-qualified firms, hired the Boeing Company to provide support for both design and production. Crew transportation to the International Space Station is planned to begin no later than 2014, and the first lunar excursion is scheduled for the 2020 timeframe (NASA, 2007).

In 1998, the Hong Kong International Airport was opened. The construction took six years and cost upwards of $20 billion USD. Although constructed under British colonial rule, the airport began operations under Chinese law. As the world's fifth-busiest international passenger airport and the most active air cargo operation worldwide, HKIA sees nearly 800 aircraft take off and land every day (Hong Kong Airport, 2002). _Identified Issue_ As business increased at the airport, the ramp-handling operations began to experience delays. Aircraft ramp handling refers to services on the ramp for an aircraft: loading and unloading of baggage, air cargo, and air mail, and transportation between the aircraft and the passenger terminal, air cargo terminals, and the air mail centre. In addition, ramp-handling services cover preparation for delivery onto aircraft of bulk baggage and baggage containers, aircraft loading bridge operation, and passenger stairs operation (Hong Kong Airport, 2002). The airport needed a solution that would not impact the daily operations of the facilities. _Response to the Issue_ The HKIA leadership team decided to implement a wireless-enabled ramp management solution, which would enable control-room staff to monitor the entire airport using computer terminals, links to airport-specific databases, and existing IT infrastructure, and which would also cover finance and accounting (Hewlett Packard, 2003). To roll out the project, a work breakdown structure needed to be implemented. This allowed the team to align itself with the scope, define deliverables, create work packages, and assign specific duties to all involved. _Outcome_ The wireless-enabled ramp management solution integration was completed without incident, as far as daily airport activities were concerned.
The ability to integrate the technology into the existing wired network infrastructure meant that employees had options for accessing data and performing their duties. Ramp workers can receive current operational information, employees are utilized more efficiently throughout the facility, security has increased through more accurate recording, and the ramp-handling operations have been able to keep up with demand (Hewlett Packard, 2003).

_Identified Issue: Defining the Elements of a Project via a Process Breakdown Structure (PBS)_ When an organization faces projects that produce tangible outcomes, such as design and building, a WBS is an ideal way to attack them. When radical change in an organization must be achieved through a series of steps or phases, a PBS is better suited to the project (Gray & Larson, 2006). Harrison-Keyes is striving for radical change in transitioning from print publishing to e-publishing. Honeywell, Incorporated serves as an example for Harrison-Keyes to benchmark, as Honeywell faced a similarly radical change in 1989. Harrison-Keyes should note that to drive transformational change as Honeywell did, a PBS approach should be considered over a WBS. Honeywell had begun to experience lagging performance in its industrial automation and control (IAC) business unit. Global competition was requiring manufacturing firms to operate more efficiently than ever before while maintaining high levels of quality. Customers from around the world, ranging from refineries to chemical plants to paper mills, purchased Honeywell's TDC 3000X system to achieve world-class process control capability. Defects, production cycle-time, and materials management had to improve for Honeywell to remain competitive (Paper, Rodger & Pendharkar, 2001). _Response to the Issue_ As a result of the lagging performance, a world-class manufacturing (WCM) program was undertaken over a three-year period. Radical improvement targets were set: a tenfold reduction in defects and a fivefold cut in production cycle-time (figures the study expressed as "1000%" and "500%") (Paper, et al, 2001). To accomplish these revolutionary results, Honeywell focused on processes rather than detailed tasks. Multi-skilled workers in charge of building entire products or modules were favored over individual workers in functional departments. Resources were assigned to processes rather than individual tasks, and factories were shut down for an intensive six-hour training session in which the need for radical change was emphasized (Paper, et al, 2001). Employee motivation was instituted through pay-for-performance plans tied to salaried workers' performance reviews. Through Honeywell's PBS experience, 10 key lessons of success were identified, two of which proved most critical to the outcome. _Outcome_ The two most critical lessons learned through Honeywell's adoption of radical change via PBS were 1) execution separates high performers from less successful PBS projects, and 2) identifying the difficulty of change is not by itself sufficient: the vision of the organization must change to reflect the radical change (Paper, et al, 2001). The Honeywell PBS experience found that execution dependent on behavioral change is extremely difficult and requires time to succeed, which is often in direct conflict with an organization's impatience and requirements for quick profits (Paper, et al, 2001). Honeywell found the largest obstacle to successful implementation via a PBS method was middle management resistance. Employees in middle management positions were notorious for being experts in their specific areas, and the transition from functional expert to process expert proved difficult to accomplish. With strong training programs and incentives, Honeywell overcame this hurdle and was eventually successful in transforming the organization.
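The WBS/PBS distinction that drove Honeywell's turnaround can be made concrete with a toy sketch: a WBS decomposes a deliverable into a tree of tasks, while a PBS orders the phases of the change itself. The structures and task names below are illustrative assumptions, not Honeywell's actual breakdowns.

```python
# Toy contrast between a WBS (deliverable-oriented task tree) and a PBS
# (process/phase orientation). Names are illustrative only.
wbs = {  # Work Breakdown Structure: decompose the deliverable into tasks
    "TDC 3000X release": {
        "Hardware": ["Fabricate boards", "Assemble modules"],
        "Software": ["Write control code", "Run regression tests"],
    }
}

pbs = [  # Process Breakdown Structure: order the phases of the change itself
    "Analyze the current process",
    "Design the to-be process",
    "Pilot with multi-skilled teams",
    "Roll out and measure defects and cycle-time",
]

def leaf_tasks(node):
    """Flatten the WBS tree into its work packages."""
    if isinstance(node, list):
        return node
    return [task for child in node.values() for task in leaf_tasks(child)]

print(leaf_tasks(wbs))  # task-by-task management
print(pbs)              # phase-by-phase management
```

The point of the contrast is that the WBS bottoms out in tasks owned by functional departments, whereas the PBS phases cut across departments, which is why Honeywell's shift to PBS required the behavioral change described above.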
_Identified Issue: Project Management Structure within a Functional Organization_ Organizations often approach project management through their existing functional hierarchy, and Harrison-Keyes is no different as it proceeds toward e-publishing. When organizations choose to run project management within their existing hierarchical structure, they accept the disadvantages of that choice: lack of focus, poor integration, slowness, and lack of ownership (Gray & Larson, 2006). Like Harrison-Keyes, the Department of Defense (DOD) operates project management within a functional hierarchy, with a dedicated project manager coordinating traffic. In a case study of the DOD's Light Amphibian Heavy-Lift (LAMP-H) project, the disadvantages parallel Harrison-Keyes's situation. The DOD's experience serves as a benchmark for the failure of project management within a functional organization and proposes a model to offset the issues encountered. _Response to the Issue_ Managing projects in the DOD has been described as the most complex process and the most difficult to manage under the best of circumstances (J. Sutterfield, S. Friday-Stroud & S. Shivers-Blackwell, 2006). The Navy, Army, and Air Force comprise a formidable functional structure to navigate when coordinating projects. In the case of the LAMP-H project, three types of conflict were identified as hindering the project: 1) interpersonal-based conflict, 2) task-based conflict, and 3) process-based conflict (Sutterfield, et al, 2006). Having identified these three broad conflict classifications, the DOD case study created effective strategies to address them while managing projects. _Outcome_ Interpersonal-based conflict within the LAMP-H project was addressed with a strategy of compromising and building collaborative relationships. When the Army, Navy, and Air Force held win-win discussions, all details of the LAMP-H project were agreed upon, which resulted in a successful outcome for the project (Sutterfield, et al, 2006). The strategy for resolving task-based conflict depends on the project manager's and stakeholders' position, power, or influence; as a project manager evaluates these factors, a determination can be made to deploy a competing, collaborating, or compromising strategy to manage the project effectively (Sutterfield, et al, 2006). Process-based conflict resolution cannot be freely shaped by a project manager because of the sequential requirements of a project: a step has to occur in its proper order for the project to proceed. This affords a heavier-handed approach toward stakeholders, since less flexibility can be allowed in order to move the project forward, so a competitive conflict-resolution strategy can be deployed; if flexibility is allowable within the project step, a more collaborative approach can be considered (Sutterfield, et al, 2006). Successful conflict management within a project run in a functional structure increases the likelihood of the project's success. The Project-Conflict Management Framework developed from the DOD LAMP-H project provides Harrison-Keyes a methodology for resolving its current e-publishing project issues; a simplified reading of that logic is sketched below.
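The following sketch encodes that simplified reading of the LAMP-H conflict logic. The branching is a paraphrase of the discussion above, not Sutterfield et al.'s formal framework, and the input values are assumptions for illustration.

```python
# Simplified conflict-resolution logic paraphrased from the LAMP-H discussion.
def resolution_strategy(conflict_type: str, pm_influence: str = "low",
                        step_is_flexible: bool = False) -> str:
    if conflict_type == "interpersonal":
        # Build win-win, collaborative relationships among the services.
        return "compromise and build collaborative relationships"
    if conflict_type == "task":
        # Depends on the project manager's and stakeholders' relative power.
        return "competing" if pm_influence == "high" else "collaborating or compromising"
    if conflict_type == "process":
        # Sequential steps leave little room to negotiate.
        return "collaborating" if step_is_flexible else "competing"
    raise ValueError(f"unknown conflict type: {conflict_type}")

print(resolution_strategy("process", step_is_flexible=False))  # -> competing
```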
Risk Management

YouTube is a popular website that allows people to post, watch, and share video clips at no charge. While the site was initially created so the designer could share home movies with family members in other states, it quickly caught on with the public and soon became a household name. In October 2006, "web juggernaut Google purchased YouTube for 1.65 billion dollars in stock. Some analysts considered this a risky investment despite the 100 million plus page views YouTube receives daily" (Wood, 2006). In an article by Daniel Wood of the Christian Science Monitor, "many users cannot get enough of the idea and love the service because it is entertaining, informative, and a community of sharing things; but some concepts are too good to be true" (Wood, 2006). At any given time there are thousands of copyrighted videos being shown illegally on YouTube. "YouTube completely missed the boat by not immediately identifying the risk with allowing users to upload videos and making deals with television and movie studios prior to launching their service" (Wood, 2006). As news of lawsuits and boycotts spread, YouTube has taken the stance that it has done everything in its power to remove copyrighted material from its site, and it is quick to remove any video that receives a complaint from a studio. Some users have received "cease and desist" letters from attorneys and face the threat of a lawsuit if copyrighted material is not taken down immediately (Wood, 2006). Recently, Viacom announced a one-billion-dollar lawsuit against YouTube, accusing the company of "massive intentional copyright infringement" (Cashmore, 2007). Although consumers seem to enjoy unlimited access to copyrighted work, other stakeholders are not dealing with the risks as well. "YouTube board members are becoming increasingly skeptical and worry that impending lawsuits and copyright issues will eventually do them in since they are not making a large profit" (Cashmore, 2007). In their defense, the creators of YouTube obviously did not recognize the scope of this type of service when it was created, but Google knew exactly what it was getting into when it purchased the upstart company. Before Google purchased YouTube, members of the media predicted that the lack of a business model and persistent problems with copyright would eventually bankrupt the company (Murray, 2006). Analysts compared YouTube to Napster and imagined the company would soon suffer the same fate as the former king of peer-to-peer file sharing. Google undoubtedly recognized the risk involved in its purchase of YouTube, but with its deep pockets it can afford to take that risk while working on a solution that appeases copyright holders. "What Google ultimately wants to do is work with the companies that have their material posted by others on YouTube and give them a share of the profit from the ad revenue" (Taylor, 2006). To do this, Google will have to create a suitable formula for projecting profit amounts and will then have to negotiate agreements with outside companies to avoid future legal issues. The explosion of YouTube has excited consumers and led some to predict the end of television's reign, but it has caused issues for almost everyone else involved. There was no legal and/or contractual consideration involved during the design or implementation stage, and a strategy was never identified along with the project details.
PayPal

eBay Incorporated is an online auction site that provides buyers and sellers a place to trade goods and services for a fee. The site has made billions of dollars as the world's largest online marketplace without the use of warehouse space, inventory, or salespeople. "Pierre Omidyar initially launched Auction Web during the infancy of the Internet in 1995 as a market to sell collectible and rare items. Auction Web incorporated and changed its name to eBay in 1996 as the site became more popular with everyday users" (Marketline, 2006). "PayPal was launched in 1999 under the name Confinity. The idealistic vision of the company was one of a borderless currency free from governmental controls. However, PayPal's success quickly drew the attention of hackers, scam artists and organized crime groups, who used the service for frauds and money laundering" (Grabianowski, 2007). The payment system also caught on with the online auction community, which found it a safe and easy way to make and receive payments for goods without having to exchange sensitive information. "The site grew far too quickly for PayPal to handle and as a result the customers suffered" (Marketline, 2006). PayPal was able to work out some of the issues with its system but for the most part was always one step behind the criminals and scammers who threatened it. When dealing with money, it is essential that risk be evaluated before the first transaction is taken. In July 2002, eBay announced it was purchasing PayPal for 1.5 billion dollars and phasing out its own fledgling payment section. While some buyers and sellers assumed PayPal is a bank, PayPal is actually an "account based system that services approximately 96 million total accounts which are available to users in 55 markets. The 96 million total PayPal accounts include approximately 19 million business and 77 million personal accounts" (Marketline, 2006). With the deep pockets of eBay behind the company, analysts assumed the security issues prevalent in PayPal's early days would quickly be shored up. In fact, fraud was occurring on the PayPal system in record amounts, and the payment system soon caught the eye of government regulators, who were forced to step in and investigate. "Regulators and attorney generals in several states, including New York and California, fined PayPal for violations and investigated the company's business practices. Some states, such as Louisiana, banned PayPal from operating in their states altogether" (Grabianowski, 2007). PayPal has since received licenses to operate in those places but lost millions of dollars in revenue while playing catch-up. Security issues were not the only problem PayPal faced. In October 2004, PayPal experienced five days of power outages. The Sudbury Star reported "continued intermittent service outages despite furious attempts to repair the ailing online payment service" (2004). There was a possibility that, due to the system setup and the lack of redundancy across the network, many accounts could be lost forever or profoundly affected by the power losses. PayPal had been playing catch-up since 1999, and it looked like time was catching up to it (Grabianowski, 2007). The power outage allowed some but not all users to complete online transactions, which resulted in a very frustrating situation for PayPal users and caused some long-time account holders to close their PayPal accounts altogether.
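The outage points to a design lesson about redundancy. The sketch below shows, using assumed endpoint names and a stand-in payment call, how failover across redundant endpoints can keep transactions flowing when the primary fails; it illustrates the principle only and is not PayPal's actual architecture.

```python
# Minimal failover sketch: try each redundant endpoint in turn. The
# endpoint names and process_payment stand-in are hypothetical.
import random

ENDPOINTS = ["payments-primary", "payments-replica-1", "payments-replica-2"]

def process_payment(endpoint: str, amount: float) -> bool:
    """Stand-in for a real payment call; fails randomly to simulate outages."""
    return random.random() > 0.5

def pay_with_failover(amount: float) -> str:
    for endpoint in ENDPOINTS:
        # Fall through to the next replica whenever the current one fails.
        if process_payment(endpoint, amount):
            return f"settled via {endpoint}"
    raise RuntimeError("all payment endpoints are down")

print(pay_with_failover(19.99))
```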
General Electric Co. - Robert Reimer

_Similar Issue Facing GE/Harrison-Keyes Publications, Inc._ Harrison-Keyes is faced with a corporate culture that differs for every department. Corporate politics has become the norm for any decision that needs to be made. A corporate culture that fosters a self-centered approach to results, coupled with the lack of project structure, has led the company to fail to implement a successful plan. General Electric (GE) has faced similar issues in its long and illustrious history and has dealt with them in a way that has built GE into a recognized global business leader. Specifically, GE has confronted the following issues, which currently face Harrison-Keyes:

1. Aligning organizational culture with project structure to achieve corporate strategies,
2. Creating a corporate culture that does not tolerate corporate politics and instead breeds a cooperative environment among management, employees, and the customers/authors, and
3. Identifying managers who are talented and who will perform.

_GE's Response to the Issues_ The former CEO of GE, Jack Welch, the recognized business leader of the twentieth century, has asked, "If you ran a baseball team, who would you want to hang out with? The head of player personnel or the chief accountant?" (ABA Banking Journal, 2006). Mr. Welch's point is that in business, as in sports, "the whole game is about talent: whoever fields the best team wins. Nothing you do is as important as building talent. Spend at least half your time developing people" (ABA Banking Journal, 2006). GE has developed managers by encouraging their input, encouraging them to take chances, and allowing them to take on leadership roles at young ages. GE has relied on performance-based initiatives and rewards risk takers. As Mr. Welch has said, "give people chances to try new things and run things when they're 30, not 50. If you're spending all your time developing good people... they can't wait for you to grow. If you want new things, take care of the people who try things. When they make mistakes, praise them, or they'll become afraid to make mistakes" (ABA Banking Journal, 2006). GE has also created a corporate culture that dismisses politics as a means to an end and encourages a culture involving all members of the GE community: its senior management, its employees, and its customers. GE's success has developed from defining a corporate culture that is not individualized by department but practiced as an organization. GE's success is based on the acronym LATIN: Leadership, in making sure it has the right leaders for the job at the right time; Adaptability, in developing flexible strategies; Talent, in investing in high-potential people; Influence, in being a company that is proactive instead of reactive; and Networks, wherein expectations are met by maintaining discipline and consistency (PR Newswire, 2007). This creates a corporate culture that advances the overall goals of GE, involves the entire company, and results not only in the creation but also in the communication of a culture that structures projects for success.
_Outcomes of GE's Response_ The result of these corporate strategies, and of the project structure they produced, is that GE increased production and empowered its employees by adopting a program called Work-Out (D'O'Brian, 1994). The program follows a "town meeting" format in which employees at all levels gather to solve problems, ultimately arriving at solutions to specific issues, which they then pass along to senior management. The program has changed the management-employee relationship in several ways, the most significant being that it has "horizontalized the company to some degree: Individual accountability for specific products and functions is maintained, but it is easier for any employees to take a hand in improving the making or doing of any specific thing" (D'O'Brian, 1994). As for identifying managers who are talented and will perform, Jack Welch, former CEO of GE, advocated the policy of firing the worst-performing staff each year. Although controversial, this tactic has not harmed GE's performance (MacAskill, 2007).