Thursday, October 31, 2019

Minimalism, functionalism and neo-eclectic Essay

Minimalism, functionalism and neo-eclectic - Essay Example Another important feature was simplicity. Simplicity was introduced to make a structure appear more natural and thus more livable. Smallness and simplicity thus became the core of minimalist aesthetics and have been associated with such important names as Walter Gropius, Alberto Giacometti, Laszlo Moholy-Nagy, Henri Gaudier-Brzeska, Constantin Brancusi, Le Corbusier, and Ludwig Mies van der Rohe. Barth has explained the minimalist doctrine in these words: "artistic effect may be enhanced by a radical economy of artistic means, even where such parsimony comprises other values: completeness, for example, or richness or precision of statement". Functionalism, as the word suggests, is a movement that focused on the utility of structures. It was felt that a structure must do what it is intended to do. And while the movement may have suffered from ambiguity, no one can seriously deny the effectiveness of the basic doctrine of functionalism: every object must be created to perform the job it is intended to perform. It was felt that each part of a structure must serve a purpose. It was a rather austere and neutral approach to building, as if a work of art were suddenly stripped of its soul. While utility was an important characteristic, and one that even modern architects cannot ignore, basic aesthetic values were largely neglected, and this gave rise to criticism. It was argued that if utility is taken too far, values other than utility take a backseat and the entire approach suffers. This has been interestingly explained by Pile (1979) in these words: "Simplistic discussions of function in design often lose sight of the complexity of multiple functional requirements that characterize the development of most modern objects. If one supposes that each thing has a function, it can seem that discussions of this matter are pointless. The definition of a chair, after all, requires that any chair can be sat in. 
Similarly, all knives must cut, airplanes fly, and failure in this kind of primary function dooms an object to total failure and, in all probability, to the junk heap. In practice, every object has, in addition to the obvious primary function, many other subsidiary

Tuesday, October 29, 2019

DQ 6 DQ 7 WEEK 5 Essay Example | Topics and Well Written Essays - 500 words

DQ 6 DQ 7 WEEK 5 - Essay Example Questionnaires were not used because key personnel determining the factors affecting gasoline price changes were not accessible. Other data were collected through interviews with gas station personnel to gather their opinions on possible causes of changes in gasoline prices. Good research provides new and unbiased findings that are important to some entity or group of people. The researcher must not benefit from the outcomes and findings of the study, so as to ensure the integrity of the findings. The information collected should best represent the population being studied. Good research uses the most appropriate data collection methods, such as surveys and interviews, and research tools such as periodical indexes, databases, and Web sites. If the null hypothesis is not rejected, then the alternate hypothesis is not accepted. This is because the initial sample of 30 records did not provide enough information to infer the statistical significance of the hypothesis being tested.
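The fail-to-reject decision described above can be illustrated with a one-sample t-test. The prices and the hypothesized mean below are invented for illustration (the essay does not give the actual records), and the critical value is the standard two-tailed 5% value for 29 degrees of freedom:

```python
import math
import statistics

# Hypothetical sample of n = 30 gasoline price records (invented data).
prices = [2.4, 2.5, 2.6] * 10

mu0 = 2.48                      # assumed null-hypothesis mean price
n = len(prices)
mean = statistics.fmean(prices)
sd = statistics.stdev(prices)
t_stat = (mean - mu0) / (sd / math.sqrt(n))

# Two-tailed critical value for alpha = 0.05, df = 29 (from a t table).
t_crit = 2.045
if abs(t_stat) <= t_crit:
    print("Fail to reject H0: no statistically significant difference found.")
else:
    print("Reject H0.")
```

With these toy numbers the test statistic falls below the critical value, so the null hypothesis is not rejected, which is exactly the situation the paragraph describes: failing to reject H0 does not mean accepting the alternate hypothesis.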

Sunday, October 27, 2019

Horizontal Gene Transfer In Prokaryotes Biology Essay

Horizontal Gene Transfer In Prokaryotes Biology Essay Horizontal gene transfer, also known as lateral gene transfer, is the phenomenon of genes transferring between prokaryotic organisms such as bacteria and archaea. HGT is a common mode of gene transfer among bacteria-like microorganisms (Archaea). Bacteria and Archaea possess a primitive nucleus; hence they are named prokaryotes and are differentiated from eukaryotic cells by the lack of a true nucleus. In horizontal gene transfer, an organism acquires genetic material from another organism without being the offspring of that organism. This process differs from vertical gene transfer (the main mode in eukaryotes), in which genes are passed from parent to offspring. Even distantly related bacteria can acquire a genetic trait from other bacteria by horizontal gene transfer; the increased drug resistance of different bacterial strains is one example. Horizontal gene transfer occurs through three different mechanisms: transformation, transduction, and bacterial conjugation. Among them, bacterial conjugation allows gene transfer by cell-to-cell contact, while transduction moves DNA from one bacterium to another by means of bacteriophages. Laboratory experiments following an outbreak in Vietnam (in 1996), caused by chloramphenicol-resistant strains of the pathogenic bacterium Meningococcus, showed that the chloramphenicol resistance genes were similar to previously identified genes (Tn4451) of Clostridium perfringens. Meningococcus is an entirely different bacterium from Clostridium, which is Gram-positive and anaerobic. HGT also occurs in eukaryotic protists and is a major characteristic of microbial evolution. 
DNA sequence analysis of different prokaryotic genomes has revealed that such genomes usually include conserved core genes that are liable to disruption by DNA islands. Such DNA islands can change considerably during prokaryote evolution by incorporating foreign DNA through insertion and deletion events. Studies of antibiotic resistance genes provide convincing evidence for the broad heritability of genes between taxonomically distant microbial strains. Horizontal gene transfer can lead to the generation of new antibiotic-resistant pathogenic strains. This shows that gene transfer and recombination processes create new pathogenic strains, and it is an example of prokaryotic evolution through the horizontal transfer of genes. Antibiotic resistance achieved through horizontal transfer has been demonstrated experimentally in the transformation-competent bacterium Streptococcus pneumoniae. Plasmid and transposon exchange by a bacterial cell that has acquired resistance can alter the genome of the recipient organism through recombination of the new resistance genes via insertion. Insertion of the new gene into the main chromosome is carried out by mechanisms directed by these transposons. Role of horizontal gene transfer in bacterial evolution: The research of Joshua Lederberg on the natural evolution of the gut bacterium E. coli (K-12 strain) in 1948 afforded a better understanding of prokaryotic evolution and of the importance of horizontal gene transfer in evolutionary studies of bacteria. DNA transformation has been demonstrated in different bacterial groups including Streptococcus, Haemophilus, Bacillus, Cyanobacteria, and Rhizobium species. Transmissible plasmids in microorganisms: studies of lambda phage and the fertility factor by François Jacob et al. in 1958 showed that the insertion of various genetic structures (episomes of DNA) into bacterial chromosomes could alter modes of existence within the cell. 
Study of these aspects revealed numerous occurrences of mobile DNA ("jumping genes") in a vast range of microorganisms, which permit F plasmid insertion and widespread horizontal gene transfer mediated by bacteriophages, plasmids, and mobile DNA. Conserved genome sequences show that E. coli-like bacteria have genomes bearing a conserved backbone of genes altered by foreign DNA inserts acquired gradually during evolution. Genomic research suggests that past natural events of horizontal gene transfer inform the interpretation of earlier events in the evolution of cells and the nature of the common ancestor of life. Lateral gene transfer between prokaryotes and multicellular eukaryotes: Detailed genome sequencing studies have produced evidence for lateral transfer of genes between prokaryotic and eukaryotic genomes. As mentioned earlier, lateral gene transfer (LGT) has an important role in the evolution of prokaryotes and unicellular eukaryotes. Lateral gene transfer between prokaryotes and multicellular eukaryotic organisms is more contentious. Evidence has accumulated for DNA of bacterial symbiotic origin within the genomes of eukaryotes; for example, roughly complete copies of the genome of the bacterial symbiont Wolbachia occur in host nuclear genomes. However, it is often unclear whether the transferred copies of the genes are functional in the eukaryotic genome. For instance, only faint expression has been found for some transferred genes; their behavior seems similar to that of mitochondrial genes recently transferred to the nucleus, and lack of function of such genes may lead to their degradation. Recent research has given a better understanding of the function and expression of transferred prokaryotic genes in the eukaryotic recipient. 
Studies by Nikoh and Nakabachi demonstrate that the pea aphid Acyrthosiphon pisum appears to have acquired two genes from bacterial strains. These were probably acquired independently from facultative secondary symbionts: one from Wolbachia or a close relative, the other from an undescribed bacterium. The authors further demonstrate that both genes are highly expressed in the bacteriocytes, specialized cells that harbor the aphid's obligate primary symbiont Buchnera aphidicola. Buchnera, which has a strongly reduced genome, lacks these two genes, while other bacteria, including Buchnera's free-living relatives, possess them. These two genes may be functionally indispensable for maintaining Buchnera, making the nuclear-inserted copies strong candidates for being functionally active. In addition, functionality is suggested by the observation that the bacterial source is not currently present in the aphid, implying that the transfer is not recent and that pseudogenization would be expected in the absence of selection for function. The aphid study is one of several recent reports of lateral transfer in symbioses. Rumpho et al. found evidence for LGT between two eukaryotes, the alga Vaucheria litorea and its predator, the sea slug Elysia chlorotica, which feeds on V. litorea. E. chlorotica retains the algal plastids, which continue to photosynthesize for months in the sea slug. This is surprising, because the bulk of the proteins required for photosynthesis are encoded in the algal nuclear genome. Rumpho et al. speculate that the sea slug can effectively maintain photosynthesizing chloroplasts because it has acquired vital genes by LGT from the algal genome, and they present evidence for LGT of a nuclear gene from prey to predator. They also show that the gene is expressed in the sea slug. 
Two additional studies report an ancient LGT event between mosquitoes and the endosymbiont Wolbachia pipientis. These concern transferred genes encoding salivary gland surface (SGS) proteins of mosquitoes, which play a role in insect-Plasmodium interactions. The same genes have been identified in two of the six sequenced Wolbachia genomes. The mechanism and role of the DNA in Wolbachia are unknown, but it has diverged substantially from its mosquito counterpart, is not pseudogenized, and is expressed. No homologs have been found in other prokaryotic or eukaryotic sequence databases. The direction of transfer (from bacterium to mosquito or from mosquito to bacterium) remains unclear. Accumulating prokaryotic gene and genome sequences reveal that the exchange of genetic information through both homology-dependent recombination and horizontal (lateral) gene transfer (HGT) is far more important, in quantity and quality, than previously imagined. The accepted view, that prokaryotic evolution can be understood primarily in terms of clonal divergence and periodic selection, must be broadened to accommodate gene exchange as a creative force, itself responsible for much of the pattern of similarities and differences among prokaryotic microbes. Alongside the operation of periodic selection on genetic diversity, gene loss and chromosomal alterations can be considered crucial players in adaptive evolution. Role of homologous recombination: The evolutionary significance of recombination events depends on the probability that the products of DNA exchange offer selective advantages. If recombination introduces maladaptive changes, eliminates niche-specific information, or disrupts co-adapted alleles, then recombinant offspring will be counter-selected. 
Therefore, ecological differentiation may impose a selective control on promiscuous genetic exchange even in the absence of any mechanistic barriers imposed by the mismatch repair system. Horizontal, or lateral, gene transfer (HGT) is different both in mechanism and in impact. Barriers to homologous recombination do not prevent its occurrence, even between very distantly related organisms, because abundant illegitimate means exist for integrating foreign DNA into the genome (Ochman, Lawrence, and Groisman 2000). HGT can occur between even very distantly related organisms, e.g., between bacteria and plants or fungi (Heinemann and Sprague 1989; Garcia-Vallve, Romeu, and Palau 2000). A consequence of such horizontal transfer is that molecular phylogenies calculated for different molecules from the same set of species, although generally compatible in broad outline (e.g., Ludwig et al. 1998), are only rarely fully congruent (Gogarten et al. 1992; Gogarten 1995). A decade ago, evolutionary biologists were reluctant to invoke HGT as an explanation for these discrepancies. Now, complete genome sequences offer abundant evidence for HGT and highlight its confounding effects in reconstructing the history of organismal evolution (Koonin et al. 2001). Detection of horizontal gene transfer: Methods for detecting signals of putative gene transfer events generally fall into two categories. Phylogenetic methods look for unusual distributions of genes across organisms and may include the identification of genes with very restricted distributions, present in isolated taxa but absent from closely related species (Olendzenski et al. 2000; Lawrence 2001). 
Phylogeny-independent methods seek to identify genes that appear anomalous in their current genomic context, likely reflecting long-term evolution in genomes with different mutational biases. These methods assay nucleotide and dinucleotide frequencies (Karlin and Burge 1995; Lawrence and Ochman 1997), codon usage bias (Mrazek et al. 2000), or patterns detected by Markov chain analyses (Hayes and Borodovsky 1998). One might consider the possibility that molecular phylogenies place thermophilic bacteria among the oldest bacterial lineages because they inhabit an environment where most of the available genes come from Archaea and where they can participate less in HGT with other bacteria. Biochemical and physiological changes can also lead to genetic isolation and thus alter an organism's apparent position in trees based on gene content or sequence. For instance, the distinctive transcriptional apparatus of the Archaea may have made it less likely for them to incorporate genes from organisms with bacterial transcription machinery. The evolution of a bacteriophage-type RNA polymerase function and its mechanisms in mitochondria provides a paradigm demonstrating that major replacements in the transcription machinery can occur (Cermakian et al. 1997; Rousvoal et al. 1998). While the occurrence of HGT is not doubted, there is apparent controversy in assessing its impact on microbial evolution, with opinions ranging from acute concern about its confounding effects on phylogenetics (Doolittle 1999) to critical reviews that downplay any major impact (Kurland 2000). If one chooses a set of closely related bacteria (e.g., the enterobacteria) and examines phylogenies of genes shared among them, many different genes may recover the consistent phylogeny of the species. 
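The phylogeny-independent, composition-based screening described above can be sketched in a toy form: compute a compositional statistic (here simple GC content) for each gene and flag genes that deviate strongly from the genome-wide average. The sequences, gene names, and the cutoff of 1.4 standard deviations are all invented for illustration; real analyses use whole annotated genomes and richer statistics (dinucleotide frequencies, codon usage):

```python
from statistics import fmean, stdev

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Hypothetical gene sequences from one genome (invented).
genes = {
    "geneA": "ATGGCGCTGGCGGCGCTGGCGGGC",   # GC-rich, matches the host
    "geneB": "ATGGCCCTGGCCGCGCTGGGCGGG",
    "geneC": "ATGAAATTTAAAATTTTTAAAATA",   # AT-rich outlier: HGT candidate
    "geneD": "ATGGCGCTCGCGGCGCTAGCGGGC",
}

gc = {name: gc_content(s) for name, s in genes.items()}
mu, sigma = fmean(gc.values()), stdev(gc.values())

# Flag genes whose composition deviates strongly from the genome average
# (the 1.4-sigma cutoff is arbitrary, chosen for this toy example).
candidates = [name for name, value in gc.items()
              if abs(value - mu) > 1.4 * sigma]
print(candidates)  # the AT-rich gene is flagged
```

Such compositional outliers are only suggestive: as the text notes, they mostly reveal recent transfers, since an acquired gene's composition gradually ameliorates toward the host genome's mutational biases.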
Similarly, estimates of HGT based on unusual gene content suggest that a fraction of genes have arrived in these genomes by horizontal transfer (Perna et al. 2001). Such results are not inconsistent with HGT having a dominant effect on the evolution of prokaryotic genomes in the long term. Transfers predating the diversification of a group such as the enterobacteria can only be detected in broader phylogenetic reconstructions (Woese et al. 2000). Similarly, surveys that use phylogenetic conflict as well as atypical gene sequences as an index of HGT in a genome consistently turn up a larger number of genes that have been subject to sequence replacement (Ragan 2001; Lawrence et al. 2002), since methods identifying atypical sequences are limited to detecting only recent transfers. HGT confounds evolutionary relationships most strongly on broad timescales. Vertical inheritance (propagating mutational changes, DNA rearrangements, and other intragenomic alterations) and DNA exchange by homologous recombination dominate over the short term. Moreover, HGT probably affects individual lineages in different ways, perhaps illustrated most dramatically by the minimal contribution of HGT to the evolution of intracellular parasites undergoing genome reduction (Andersson and Andersson 1999; Wernegreen et al. 2000). Considerations of range and scale can serve as effective arbiters when merging data collected from diverse systems. Dykhuizen and Green (1991) proposed that homologous recombination provides taxonomic cohesion among groups of strains. Frequent gene exchange by homologous recombination results in strains within a species that resemble one another more than they resemble strains outside the species. HGT can provide phylogenetic cohesion at higher taxonomic levels. 
In both cases, genes within the groups are expected to show incongruent phylogenies, though the groups themselves remain monophyletic for most genes. HGT and its impact on gene trees and rRNA phylogenies: A number of groups have inferred organismal phylogeny by means of so-called gene-content trees (Fitz-Gibbon and House 1999; Snel, Bork, and Huynen 1999; Tekaia, Lazcano, and Dujon 1999). This approach uses the sheer presence of a gene as a character, and the dendrograms formed this way show substantial agreement with standard 16S rRNA phylogenies, reproducing the three-domain partition and the grouping of genomes from members of the same phylum. Even though other recent analyses conclude that HGT has played a considerable role in determining gene content (Snel, Bork, and Huynen 2002), these results contrast with most resolved phylogenies of specific protein-coding genes, which show dramatic conflicts with both the 16S rRNA and genome-content trees. The overall correspondence between gene-content trees based on entire genome sequences and 16S rRNA phylogenies would seem to argue that HGT has played only a limited part in shaping the evolution of microbial lineages (Snel et al. 2002). There is another possible explanation for the similarity between gene-content trees and phylogenies based on rRNA: rRNA phylogenies might agree with gene-content analyses because rRNA genes are themselves transferred, and both phylogenies reflect large-scale gene transfer. Intragenic recombination has been observed in various genes, and gene-conversion processes tend to make copies of duplicated genes more similar to one another (Gogarten and Olendzenski 1999). 
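The gene-content approach described above treats each genome as a set of gene families and builds a tree from pairwise distances between those sets. A minimal sketch of the distance step, using Jaccard distance over invented genome contents (the genome names and gene families are hypothetical):

```python
# Hypothetical gene-content comparison: each genome is the set of gene
# families it contains; distance = 1 - Jaccard similarity.
genomes = {
    "genomeA": {"rpoB", "gyrA", "lacZ", "recA"},
    "genomeB": {"rpoB", "gyrA", "recA", "nifH"},
    "genomeC": {"rpoB", "hgtX", "nifH"},
}

def jaccard_distance(a: set, b: set) -> float:
    """Fraction of gene families not shared between two genomes."""
    return 1 - len(a & b) / len(a | b)

pairs = {}
names = sorted(genomes)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        pairs[(x, y)] = jaccard_distance(genomes[x], genomes[y])

for pair, d in sorted(pairs.items()):
    print(pair, round(d, 3))
```

A clustering method (e.g., neighbor-joining) applied to such a distance matrix yields the gene-content dendrogram; the text's point is that HGT perturbs exactly these presence/absence characters.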
Recognition of gene transfer within and among lineages restructures microbial evolution in more ways than simply offering new interpretations of the pattern of microbial phylogeny. Fixed models of microbial evolution by mutational processes, combined with the measurement of environmental tolerances in laboratory environments, impart a view of ecological niches as somewhat static domains within which organisms evolve predictably toward maximal fitness. For example, it is possible to measure an organism's improvement in fitness after growing for thousands of generations in a glucose-limited environment (Papadopoulos et al. 1999). Bacteria confined to chemostats can change the niches designed for them, inventing new niches. For example, bacterial strains selected for glucose uptake could generate strains specializing in the scavenging of acetate waste products (Treves, Manning, and Adams 1998). Viewing evolution by HGT as a method of niche acquisition rather than of altered niche exploitation has unexpected implications. For instance, a mesophilic heterotroph might gain access to a nearby substrate-rich but too-warm location used by moderately thermophilic autotrophs, through acquisition from them of genes encoding more thermostable versions of the proteins whose labilities set its upper growth temperature. Conceivably, the newly acquired genes are at first poorly adapted to the heterotroph's other cellular machinery, so that growth in either environment is very slow and organisms carrying these new genes cannot compete in the original environment. They would nonetheless be the only heterotrophs at the higher temperature and could come to dominate there. 
Thus, niche acquisition can mean that many organisms are successful because of the distinctiveness of the niches they recently discovered rather than because of fine-tuning of their cellular machinery to the exploitation of that niche. The niches produced by gene transfer events vary widely in their stability and novelty. Some events, like the acquisition of an antibiotic resistance gene, allow for transient exploration of a new environment, but such a lineage may not persist over evolutionary time (that is, this event will probably not establish a clade of antibiotic-resistant bacteria distinguished by their shared ability to resist a particular antibiotic). Other events are associated with the long-term exploitation of new niches, like the acquisition of the lac operon by E. coli or of pathogenicity islands by Salmonella. Rarely, a gene transfer event may allow for the formation of radically different organisms that inhabit niches entirely inaccessible to organisms relying on mutational processes alone to explore environments. Examples of such lineages include the green plants (acquiring chloroplasts by endosymbiosis [Bonen and Doolittle 1975]), methanotrophs (gaining the ability to make unusual cofactors by acquiring genes from methanogenic archaea [Chistoserdova et al. 1998]), cyanobacteria, and bacteria utilizing halorhodopsin homologues as light-driven proton pumps (Beja et al. 2001). A classic model for adaptation has been the shifting balance theory (Wright 1982). Adaptive changes may arise through sequential selection of mutations, and perhaps some genome-specific, unique genes are the products of such typically Darwinian processes. 
But intragenic recombination can facilitate rapid exploration of this adaptive landscape, because the valleys of low fitness need never be crossed (Bogarad and Deem 1999). Variant alleles with near-optimal fitnesses may be recombined to introduce multiple changes simultaneously, thereby avoiding the formation of suboptimal intermediate states. HGT offers an extended scope to these models, which propose, ultimately, that recombination between existing variants offers access to fitness peaks. Although fitness peaks may never be reached if they must be approached one gene at a time, many genes may be acquired together as bacterial operons and gene clusters (Lawrence et al. 2001). From an evolutionary perspective, lineage diversification is frequently viewed as an instantaneous event, a point after which genes in two groups of organisms are no longer in genetic communication. Recombination between populations at such loci may yield less fit offspring that would be counter-selected. Homologous recombination can exchange alleles between such populations at loci uninvolved in the initial ecological differentiation (Lawrence 2002). Clearly, a gene is duplicated each time a cell divides. In separate organisms, genes are free to evolve distinct biochemical functions. Moreover, the functional range of the gene product may expand to include additional activities, or some of the gene product's original functions may be lost if those functions are not critical in that organism. If genes are never reintroduced into the same cytoplasm and their ecological roles never overlap, then orthologous genes persist in separate cytoplasmic contexts. If the genes are reunited in the same cytoplasm, they must have achieved physiological distinctness for both to persist. 
Reintroduction of genes into the same genome is mediated by DNA transfer, either by homologous recombination with unequal crossing-over (in which case a merodiploid strain is created at the initial point of DNA exchange) or by HGT, which is the most dramatic way of allowing gene transfer to establish paralogous genes in the same cell. From the evolutionary-theoretical standpoint, horizontal transfer, above all as it occurs between eukaryotes and bacteria, is a testament to the remarkable unity of molecular-biological mechanisms in all types of cells, which results in the compatibility of eukaryotic and bacterial proteins that have evolved in their distinct milieus for billions of years. While co-adaptation of proteins during evolution might hinder horizontal transfer of particular types of genes, the components of many functional systems appear to be entirely compatible; one might compare this with the routine expression of eukaryotic proteins in bacteria exploited in the laboratory. This is especially significant for xenologous gene displacement, because in these cases the transferred, heterologous version of a gene must immediately become superior, from the standpoint of selection, to the original version typical of the recipient species. In one case, that of a eukaryotic isoleucyl-tRNA synthetase displacing the original gene in a number of bacteria, this has been convincingly explained by acquisition of antibiotic resistance. It seems likely that these observations have general implications for xenologous gene displacement. In some cases of acquisition of new genes, the nature of the selective advantage also appears clear, such as for the ATP/ADP translocases acquired by the intracellular parasitic bacteria Chlamydia and Rickettsia. 
In most instances, however, comparative genomics can only point to the genes that have probably entered the particular genome by horizontal transfer; assessing the biological impact of horizontal gene transfer will require experimental studies with these genes. Dramatic differences in gene repertoires even among bacteria that belong to the same evolutionary lineage, such as E. coli and Haemophilus influenzae, indicated that genome evolution cannot be plausibly described by vertical descent alone. It is clear that much of the disparity is attributable to differential gene loss, especially in parasites, but horizontal gene transfer is the other major evolutionary factor that may help explain the emerging complex picture of prokaryotic genomes. The archaeal genomes presented a particularly striking genomescape strongly suggestive of massive horizontal gene transfer. In agreement with earlier indications from phylogenetic studies, but now on the whole-genome scale, it became clear that archaeal proteins divide into those that are most similar to their bacterial homologs and those that look eukaryotic. Some exceptions notwithstanding, the bacterial and eukaryotic proteins in archaea were neatly divided along functional lines, with those involved in information processing showing the eukaryotic affinity, and metabolic enzymes, structural components, and a category of uncharacterized proteins appearing to be bacterial. 
Because the informational components generally appear to be less subject to horizontal gene transfer, and in keeping with the standard model of early evolution whereby eukaryotes share a common ancestor with archaea, these observations have been tentatively explained by massive gene exchange between archaea and bacteria. This view was further supported when the genomes of two hyperthermophilic bacteria, Aquifex aeolicus and Thermotoga maritima, were sequenced. Both of these genomes contained a considerably larger fraction of archaeal genes than any of the other bacterial genomes, establishing a plausible connection between similarity in the lifestyles of evolutionarily distant organisms and the apparent extent of horizontal gene exchange between them. These findings also highlighted the issue of the adaptive versus opportunistic nature of horizontal gene transfer. The realization that the contributions of horizontal gene transfer and lineage-specific gene loss to the gene repertoires of prokaryotes are comparable to those of vertical descent amounted to a major shift in our understanding of evolution. Indeed, it became apparent that, in many cases, phylogenetic trees for individual genes were incongruent not because of artifacts inherent in tree-construction methods but because of genuine differences in the evolutionary histories of these genes brought about by horizontal transfer. Horizontal gene transfer events can be classified into at least three distinct categories with respect to the relationship between the horizontally acquired gene and homologous genes pre-existing in the recipient lineage. Acquisition of eukaryotic genes by bacteria is of particular interest because of the possible role of such horizontally transferred genes in bacterial pathogenicity. 
Chlamydiae and their relatives have had a long history of parasitic or symbiotic relationships with eukaryotes, and at some stages of their evolution may have been parasites of plants or their relatives. Summary: Comparative analyses of gene and genome sequences indicate that exchange of genetic information within and among prokaryotic species, however defined, is far more pervasive and general than previously thought. While homologous recombination is limited by sequence divergence and must decline sharply with phylogenetic distance, exchange by the various illegitimate recombination processes collectively designated HGT is not so constrained. New understanding of both phenomena and their possible interaction suggests that accepted models of prokaryotic evolution based on clonality and periodic selection are insufficient to describe the mode of prokaryotic evolution at the species level, and that treelike phylogenies are inadequate to represent the pattern of prokaryotic evolution at any level. An elaborated new framework holds that a coherent model for prokaryotic evolution invoking gene transfer as its principal explanatory force is sufficient and would have many benefits for understanding diversification and adaptation. In particular, it could help resolve the species problem, recognize the real differences in tempo and mode between prokaryote and higher eukaryote evolution, let the untangling of the convoluted histories of genes and genomes supersede the quest for a single true organismal phylogeny, support new models for the definition of prokaryotic niches and the description of adaptedness, and, at the level of the gene, suggest new scenarios for the evolution of different functions. 
Elements of this new view as it relates to species and adaptation have already been clearly articulated, particularly by Maynard Smith, Spratt, and Levin and their collaborators (Levin and Bergstrom 2000; Maynard Smith, Feil, and Smith 2000; Feil et al. 2001). Phylogenetic implications have also been explored by Martin (1999) and Woese (2000), among others. Taking DNA transfer seriously promises a broad and radical revision of the prokaryotic evolutionary paradigm. This will come about through a fusion of population genetics, molecular genetics, epidemiological and environmental genomics, microbial ecology, and molecular phylogeny, fields that have customarily developed in isolation from one another. Although the new view may seem opposed to established understandings of prokaryotic evolution, in the long run it can support a synthesis that acknowledges DNA exchange and clonality, web-like and tree-like patterns, and the evolution of new functions by many modes. A key task is to determine whether the frequencies of within- and between-lineage DNA exchange support a model like the one depicted here, or whether vertical descent remains the best descriptor of the history of most genes over evolutionary time. Although there are complex issues of measurement and definition to overcome, the rapidly accumulating genome sequences provide no shortage of data. Acquisition of eukaryotic genes by bacterial genomes, chiefly of parasites and symbionts, and, to a lesser extent, by archaeal genomes, is one of the prominent directions of lateral gene flow. Apparent horizontal gene transfer has been detected in different functional classes of genes, although it is chiefly characteristic of certain categories, such as aminoacyl-tRNA synthetases and certain signal transduction systems.

Friday, October 25, 2019

Computers and the Film Industry Essay -- Computer Generated Images CGI

Computers and the Film Industry Computer technology invades the film industry. The existence of computers has aided in the production of genres of film ranging from action-movie special effects to cartoon animation and claymation. Computer Generated Imagery, better known as CGI, assists filmmakers in many ways. An image can be made two-dimensional from a three-dimensional scene, camera angles can be altered to make a character seem larger and thus more important than its surrounding bodies, and colors can be brightened or neutralized, among other things (Parsons, Oja 1). Without the aid of computers, movies would not have the ability to be what they are today. The manual animation technique known as "in-betweening," where an artist draws hundreds of images to produce the idea of motion, takes countless hours and requires the dedication of an artist's full time. With the aid of computers, images are generated at a fast pace and movement can be altered with the click of a mouse. Thus, those hired to do such jobs have the opportunity to better the product with far less time and frustration. Like "in-betweening," morphing, another film technique, requires long hours and hard work. Unlike "in-betweening," which can be done without the aid of a computer, morphing is a special effect that cannot be produced without one. It consists of filming beginning and ending shots; the middle is left for the computer to generate. Despite the aid of the computer, this process is still quite complex. Short scenes can take a year to morph, but the end product may make all the difference for the enjoyment of the film. Computers are not only used for animation techniques and special effects, they are used... ...n able to reach otherwise. With unlimited possibilities and the creative minds in the world, the film industry is likely to see drastic changes. 
As the world has in the past, people's likes and dislikes will change with the ever-changing technological world. What we enjoy as a society in 2005 is likely, in the years to come, to be considered as bland as we now consider black and white silent films. Works Cited Dirks, Tim. "Landmarks in Classic Hollywood/American Films." The Greatest Films. 1996-2005. www.filmsite.org MacNeil-Lehrer Productions. 2005. www.pbs.org/newshour Parsons, June Jamrich and Dan Oja. "Computers In Context, Film." Computer Concepts. 8th Edition. Course Technology, 2006. p. 392

Thursday, October 24, 2019

Issues of Wider Professional Practice and Professionalism Essay

In this assignment I will be examining some of the main issues I believe impact on teachers' professional practice, and I will look at the way they impact on my employer, Inclusive Access (IA). IA is a social enterprise and independent specialist training organisation in the post-compulsory education and training (PCET) or Lifelong Learning Sector (LLS). I will attempt to show how some of these issues impact on individual teachers in the organisation and on a teacher's professional image and status. I will go on to argue that the political and economic landscape makes it very difficult for organisations like Inclusive Access, and for freelance tutors, to meet the professional standards required when compared to other PCET organisations in both FE (such as colleges) and HE (such as universities). In conclusion to this assignment, using some of the current influences and changes in government direction and policy, I will reflect on the way I can improve my own wider professional practice and that of my team in my area of responsibility. As Training Manager at a social enterprise there are wide-reaching pressures on the organisation that impact on our practice as professional teachers in the LLS and on the organisation as a professional training body. In fact these pressures currently bear on the whole education system. The political, economic, social, technological, legal and environmental (PESTLE) factors impact greatly on the question posed for this assignment, as we enter possibly one of the most challenging phases for education, and particularly PCET, in the last few decades. At IA there are recurring issues affecting the professionalism of the courses run, the professional nature of the teachers and support staff employed, and the values underpinning the company's social aims. For example, funding is ever harder to source and the funding streams accessed are varied and fluctuating, originating from a number of sources. 
This can lead to inconsistency of provision and to fitting the courses to the fund rather than to the learners, thereby impacting on our perceived professionalism. Another example would be the "rules" on pots of funds from the public sector creating demands for more learners on courses, impacting on class size, or selecting people for courses based on numbers, not suitability, which in turn impacts on drop-out rates and dissatisfied learners, potentially affecting our perceived professionalism. There is a move towards contracts being payment-by-results to drive value for the public purse. This could force smaller organisations like our own, which are less cash- and asset-rich, out of the market. However, on the positive side, it does mean a culture of collaboration (one that has not existed for some time) is being resurrected, which in my view is a good thing. In the long run this should raise standards of outcome and give learners a more seamless journey through the LLS. During the development of PCET from the 1980s until the present, it is evident that teaching in post-compulsory education has had to keep up and look beyond today towards the requirements of the skilled workforce of the future. Further and higher education has become more regulated and scrutinised in a bid for it to become better placed to meet the needs of learners and employers. Indeed, the evolution of FE and the LLS during the 1990s saw great change, driven politically with economics at its heart: FE teachers' contracts were changed, there were strikes, and funding was severed centrally, so that a new regime shaped the way PCET is delivered today and the view of the professional status of teachers in this sector. Shain (1999), in her research paper, said then that "teachers' work in the UK Further Education (FE) sector is undergoing reconstruction through processes of marketisation and managerial control". 
I would agree with her and can see that this process is even more evident today, witnessed through competition for funding, student numbers, targets, league tables and scrutiny driving the ethos of the sector. I would ask: how can the FE teacher be a true professional in their work with this culture around them? Tedder (1994) defines professionalism from his experience of teaching in FE and says that the term "professional" can convey a range of meanings covering teaching practice, a set of vocational standards, values and a code of conduct for teachers, plus a remit for continual monitoring and improvement. This early view (expressed in 1994) has in my opinion been the way that the sector has consequently developed from within, attempting to drive internally in response to the external pressure to conform to the PESTLE factors. In 2007 the Institute for Learning (IfL) was set up in response to the XXX report, and was (until recently) endorsed by the government to represent and act as a compulsory body for Lifelong Learning teachers of adult education, defining the code of conduct, making membership of the professional body a compulsory requirement, and requiring evidence of current competence to teach via 30 hours of continuous professional development (CPD) per annum, submitted to and vetted by the IfL. By making teacher training and CPD compulsory, the IfL has overturned the reluctance of teachers to become dual professionals. Norman Lucas (1996) has argued that this duality of professionalism, i.e. that of being at one and the same time a teacher and an expert in a professional or craft/trade area, has dogged the development of a statutory qualification structure. He says that historically lecturers in FE had seen their expertise as sufficient for teaching, thereby putting their specialist knowledge above pedagogy. He says that by becoming professional teachers they will narrow their specialist expertise. I disagree with this view. 
Everyone can remember the good and bad teachers at college or university, and those that not only knew the subject but knew how to teach got the respect and results from their students. Randle and Brady (1997) argue that although they believe teaching in FE has been deskilled and deprofessionalised, professional teachers retain a commitment to 'public service' values of altruism and teacher autonomy that are fundamentally opposed to managerialism. They believe this is the essence of professionalism in FE and that it is paramount to FE. Appendix xxx is an extract depicting the polarisation they described. I believe this point is important: it is where individual personal professionalism collectively adds up to professionalism per se in the organisation or the LLS. Elliott (1996) rejects the notion of professionalism in favour of a concept of the 'reflective practitioner' for understanding teachers' work. I believe this is a vital factor in professionalism, but it cannot be the only way that a professional improves their practice. What if the teacher is not self-aware or receptive to personal feedback? How can this improve teaching and learning in isolation? Hodkinson (1995) argues for the retention of professionalism without accepting the exclusivity of a profession. He explores the uses and limitations of competence attributes towards a redefinition of professionalism based on notions of 'personal effectiveness', 'critical autonomy' and community. These to me are self-actualisation goals in Maslow's terms, higher-order needs. But I fear people need a structure, a framework and a method to achieve these, so why then is a professional body to belong to such a bad thing? The Institute for Learning and Teaching in Higher Education (ILTHE) was an established body that ceased to exist in 2004, and then in 2007 the IfL was set up; reinventing the wheel is a theme of politics, I fear. Appendix xxx explains the history of the ILTHE and the HEA. 
Successive governments and reports, including Kennedy in 1998, Tomlinson 2004, Leitch (date), Wolf 2009 and Lord Lingfield 2011/12, continue to change the way education is structured and delivered, and which political party is in power determines the swing between regulation and market forces affecting the culture in lifelong learning. By the very nature of the way the PCET sector is being forced to be accountable, it could be seen that it has become de-professionalised and de-characterised, and education is becoming de-valued as the accountants take over the establishments to drive value for the public purse. Ofsted scrutiny and league tables shape the way education in FE is delivered, as tutors "fear" for their grade and managers drive for results; where does this leave a professional tutor room to develop as a professional? Many authors illustrate this polarisation of managerialism and professionalism (ref. app. xxx), including John Lea. John Lea observes that the managers and scrutiny of teachers introduced to make them more professional and drive value for the taxpayer and the learner have actually led to teachers becoming de-professionalised per se. He states that introducing accountability through layers of funding and scrutiny bodies has meant the sector has had to adopt more of a business approach, with colleges becoming more like retail outlets (p. 75), where learners choose their learning opportunity from a range of providers, picking the one that markets itself the best. On the negative side this could mean students "consuming" education in the same way they purchase items from a discount shop, demanding high quality at low prices. He goes on to say: "if colleges come heavily under this sway we might expect them to seek to eliminate any downside to their students' purchase – customer satisfaction or your money back. 
Will we see a time when students cannot fail a course?" I would ask: is this de-valuing and watering down the status of PCET courses so that anyone can achieve, or does it widen participation and raise standards, leading to a more highly skilled workforce, which then reflects well on the professional standards and values of teachers and organisations in the sector? Whichever way it is seen, the reality is that it is happening, and future PCET organisations are moving in this direction. Lord Lingfield in his review (the final report), amongst many recommendations, suggests that the future of PCET will not distinguish between further and higher education and that they should merge. This trend is current and set to continue. A great example is here and now: West Cheshire College, my course, where the awarding body is Chester University and progression for my cohort is clearly into HE. The simpler the learner journey, the more professional it feels for learners too. I believe that in the modern world, standardisation, comparability and the learner journey should be seen as crucial by decision makers and that they will drive development in the sector. To be professional, tutors rely on quality time to prepare, to keep teaching practice current and to incorporate new and innovative teaching methods. This is a difficult task, especially as many tutors are paid sessionally and planning is often not paid for by employers. Similarly, professional development and CPD are expected but not often provided by employers. As professionals, tutors are expected to complete 30 hours per annum of continuous professional development (CPD) and to reflect and choose the right development. Under the IfL this was implicit and required for membership; this requirement is now voluntary, as the IfL membership rules have changed following Lord Lingfield's review of the sector. The best and most forward-thinking providers will support their staff to improve; it cannot be left to individuals to choose their own CPD entirely. 
Since the Institute for Learning was set up in 2007, I believe it has not achieved what it set out to do, and I concur with many elements of the Lingfield report. I think it has had little impact in raising the sector's professional status, although it has had some impact in raising the standards of teaching. For example, after 2007 Neighbourhood Colleges were forced to employ only tutors with a Preparing to Teach in the Lifelong Learning Sector (PTLLS) qualification to lead courses in these centres. Previously anyone could have taught a course in their local community or Neighbourhood College. Insisting on PTLLS has improved the quality of provision, but on the downside it has meant that local talent and enthusiasm has been lost from those who handed down skills and shared knowledge on a more universal basis. Taking a different view of core professional values, one that is not about Ofsted or anything other than the traditional role of a teacher, Sue Cross in her book Adult Teaching and Learning says that the professional character of the teacher means assuming a specific set of obligations and standards, but one within which an individual's background, expertise and creativity are free to flourish. Sue Cross's definition: "Professional teachers seek to communicate their field of knowledge to the learner with fidelity and accuracy, within the context of their professional ethics and in such a way that the learner is nurtured, supported and able to develop" (p. 161). She says that a professional teacher has three principal characteristics: a teacher acts with professional agency, a teacher acts ethically, and a teacher exercises professional judgement. And she believes that to be a teacher really means to be a learner yourself. Therein lies the crux of being a professional: exercising professional judgement and being allowed to. Society doesn't allow mistakes nowadays; does being a professional suddenly make a person infallible? 
Other definitions of professionalism and the professional include Marian Wollhouse's Teaching the Post-16 Learner. Marian suggests that there are seven key areas of teaching defined as underpinning the competence that supports and informs all other processes, and that the learner is put at the centre of all that teachers do. In that way the context of the teacher as a professional is prescribed, and this, amongst other things, influenced the development of the professional domains written by the sector skills council and published in 2007. In the foreword, Bill Rammell, the then Minister of State for Lifelong Learning, Further and Higher Education, said that the new professional standards were a direct response to Ofsted's plea for clearer standards. Accountability for teaching and learning and being a specialist in one's own area was paramount. This was a precursor to the IfL being launched in September of that year. And again, in a bid for more depth and scrutiny to make the profession of teaching accountable, in September 2012 another new set of Ofsted regulations was put into place, this time with a specific set for the FE sector as well as one for all education. As a direct consequence of Lingfield, Wolf etc. and the drive from the government to make organisations more locally accountable, this Ofsted framework now puts teaching and learning as the most important factor, refocusing the Common Inspection standards (see appendix xxx, CIF). Prue Huddleston and Lorna Unwin, in chapter 8 of the third edition of Teaching and Learning in Further Education: Diversity and Change, talk about professional development, and here, I believe, is a central factor in the issues of professional conduct and accountability. To be an educator in the PCET sector, I believe teachers should embrace all it is to be a teacher. The breadth of skills, the patience, the planning, the innovation and the ability to keep on a personal learning journey can mean it is difficult to fulfil this multi-faceted and demanding role. 
To do this, teachers need to approach their work as "professionals" and undertake in-depth and varied professional development. Without it, teachers will become stale and one-dimensional, not just in their teaching but in their ability to fulfil this role and inspire their learners to achieve. The goals they set for their learners will become less stretching, as then do the goals they set for themselves. "Every FE teacher has to make plans to ensure he or she has access to relevant and appropriate professional development opportunities" (p. 209). Inclusive Access is an independent provider of adult training and education across a myriad of disciplines and subjects. My role is multi-faceted: I project manage, line manage, develop new business, recruit tutors, am in charge of quality for awarding bodies, and teach myself. It is a role that I believe requires a hands-on approach, and therefore I still teach to keep up my professionalism. This can be a challenge as the role moves at times towards more of a managerial overview role. One of IA's unique selling points is its people: the tutors, assessors and teaching support staff, most of whom are not directly employed. That relationship is an interesting one to manage, aiming to keep their individual professional values in tune with those of the company. In order to engender the ethos and professional standards required, I do have to lead by example, share CPD knowledge and enthuse the teachers to try new teaching methods. IA does not have the IT resources and budgets, for example, that FE colleges can access. The courses must still be of as high quality as (or higher than) the competition's. Often I think we achieve this through the personality of the teachers, their in-depth subject knowledge and the way we assist the learners on their journey with signposting and employability skills. 
Interestingly, this is now a key factor that Ofsted will be seeking from FE, so I will need to keep a step ahead and look for ways to continue to improve our learner experience and our teacher support. I will need to ensure our literature and marketing are standardised with the LLS sector to maximise our visibility and professional image in a competitive environment. My own personal CPD journey will be vital. I realise there is a lot at stake in the way I view professionalism and being a professional. Not only will these views affect my personal development, but because of my role they affect the organisation and the teachers employed. Extrinsic factors that cannot be changed will continue to impact on teachers' professionalism: PESTLE factors, Ofsted, and government papers and reports leading to changes in scrutiny, standards and regulation. But intrinsically, if you hold to the notion of being a consummate professional, loving being a teacher, being honest, reflecting and improving, sharing best practice, keeping always learner-centred and choosing challenging CPD as a lifelong learner yourself, in my view you won't go far wrong!

Wednesday, October 23, 2019

Sensitivity Analysis

Linear Programming Notes VII: Sensitivity Analysis

1 Introduction

When you use a mathematical model to describe reality you must make approximations. The world is more complicated than the kinds of optimization problems that we are able to solve. Linearity assumptions usually are significant approximations. Another important approximation comes because you cannot be sure of the data that you put into the model. Your knowledge of the relevant technology may be imprecise, forcing you to approximate values in A, b, or c. Moreover, information may change. Sensitivity analysis is a systematic study of how sensitive (duh) solutions are to (small) changes in the data. The basic idea is to be able to give answers to questions of the form:

1. If the objective function changes, how does the solution change?
2. If resources available change, how does the solution change?
3. If a constraint is added to the problem, how does the solution change?

One approach to these questions is to solve lots of linear programming problems. For example, if you think that the price of your primary output will be between $100 and $120 per unit, you can solve twenty different problems (one for each whole number between $100 and $120).1 This method would work, but it is inelegant and (for large problems) would involve a large amount of computation time. (In fact, computation time is cheap, and computing solutions to similar problems is a standard technique for studying sensitivity in practice.) The approach that I will describe in these notes takes full advantage of the structure of LP problems and their solution. It turns out that you can often figure out what happens in "nearby" linear programming problems just by thinking and by examining the information provided by the simplex algorithm. In this section, I will describe the sensitivity analysis information provided in Excel computations. I will also try to give an intuition for the results. 
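The brute-force approach just described is easy to automate. Here is a minimal sketch (not part of the original notes) using SciPy's linprog, which minimizes, hence the negated objective coefficients: solve one small LP repeatedly while sweeping a single objective coefficient, and watch where the solution jumps.

```python
# Sketch: brute-force sensitivity analysis by re-solving an LP many times.
# The LP is the example used later in these notes; linprog minimizes, so
# the objective coefficients are negated.
from scipy.optimize import linprog

A = [[3, 1, 1, 4], [1, -3, 2, 3], [2, 1, 3, -1]]
b = [12, 7, 10]

solutions = {}
for c1 in range(1, 13):  # sweep the coefficient of x1 in the objective
    res = linprog([-c1, -4, -3, -1], A_ub=A, b_ub=b)
    solutions[c1] = tuple(round(v, 6) for v in res.x)

# The solution is identical for every c1 below a critical value and
# different above it; printing the dict shows where the jump occurs.
print(solutions)
```

The rest of the notes show how to read off that critical value directly from the sensitivity report instead of re-solving.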
2 Intuition and Overview

Throughout these notes you should imagine that you must solve a linear programming problem, but then you want to see how the answer changes if the problem is changed. In every case, the results assume that only one thing about the problem changes. That is, in sensitivity analysis you evaluate what happens when only one parameter of the problem changes.

1 OK, there are really 21 problems, but who is counting?

To fix ideas, you may think about a particular LP, say the familiar example:

max 2x1 + 4x2 + 3x3 + x4
subject to
3x1 + x2 + x3 + 4x4 ≤ 12
x1 − 3x2 + 2x3 + 3x4 ≤ 7
2x1 + x2 + 3x3 − x4 ≤ 10
x ≥ 0

We know that the solution to this problem has value 42, with x1 = 0, x2 = 10.4, x3 = 0, x4 = 0.4.

2.1 Changing Objective Function

Suppose that you solve an LP and then wish to solve another problem with the same constraints but a slightly different objective function. (I will always make only one change in the problem at a time. So if I change the objective function, not only will I hold the constraints fixed, but I will change only one coefficient in the objective function.) When you change the objective function it turns out that there are two cases to consider. The first case is the change in a non-basic variable (a variable that takes on the value zero in the solution). In the example, the relevant non-basic variables are x1 and x3. What happens to your solution if the coefficient of a non-basic variable decreases? For example, suppose that the coefficient of x1 in the objective function above was reduced from 2 to 1 (so that the objective function is max x1 + 4x2 + 3x3 + x4). What has happened is this: You have taken a variable that you didn't want to use in the first place (you set x1 = 0) and then made it less profitable (lowered its coefficient in the objective function). You are still not going to use it. The solution does not change. 
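The example and this first claim can be checked numerically. A sketch (not in the original notes) using SciPy; linprog minimizes, so the objective is negated:

```python
# Sketch: verifying the example LP and the non-basic-variable claim.
from scipy.optimize import linprog

c = [-2, -4, -3, -1]                # max 2x1 + 4x2 + 3x3 + x4
A = [[3,  1, 1,  4],                # 3x1 +  x2 +  x3 + 4x4 <= 12
     [1, -3, 2,  3],                #  x1 - 3x2 + 2x3 + 3x4 <= 7
     [2,  1, 3, -1]]                # 2x1 +  x2 + 3x3 -  x4 <= 10
b = [12, 7, 10]

res = linprog(c, A_ub=A, b_ub=b)    # bounds default to x >= 0
print(-res.fun, res.x)              # value 42, x = [0, 10.4, 0, 0.4]

# Lower the coefficient of the non-basic x1 from 2 to 1: the solution
# should not change, exactly as the text argues.
res_low = linprog([-1, -4, -3, -1], A_ub=A, b_ub=b)
print(-res_low.fun, res_low.x)      # still value 42, x = [0, 10.4, 0, 0.4]
```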
Observation: If you lower the objective function coefficient of a non-basic variable, then the solution does not change. What if you raise the coefficient? Intuitively, raising it just a little bit should not matter, but raising the coefficient a lot might induce you to change the value of x in a way that makes x1 > 0. So, for a non-basic variable, you should expect a solution to continue to be valid for a range of values for coefficients of non-basic variables. The range should include all lower values for the coefficient and some higher values. If the coefficient increases enough (and putting the variable into the basis is feasible), then the solution changes. What happens to your solution if the coefficient of a basic variable (like x2 or x4 in the example) decreases? This situation differs from the previous one in that you are using the basic variable in the first place. The change makes the variable contribute less to profit. You should expect that a sufficiently large reduction makes you want to change your solution (and lower the value of the associated variable). For example, if the coefficient of x2 in the objective function in the example were 2 instead of 4 (so that the objective was max 2x1 + 2x2 + 3x3 + x4), maybe you would want to set x2 = 0 instead of x2 = 10.4. On the other hand, a small reduction in x2's objective function coefficient would typically not cause you to change your solution. In contrast to the case of the non-basic variable, such a change will change the value of your objective function. You compute the value by plugging x into the objective function: if x2 = 10.4 and the coefficient of x2 goes down from 4 to 2, then the contribution of the x2 term to the value goes down from 41.6 to 20.8 (assuming that the solution remains the same). If the coefficient of a basic variable goes up, then your value goes up and you still want to use the variable, but if it goes up enough, you may want to adjust x to increase the value of x2, if that is even possible. 
In many cases, this is possible by finding another basis (and therefore another solution). So, intuitively, there should be a range of values of the coefficient of the objective function (a range that includes the original value) in which the solution of the problem does not change. Outside of this range, the solution will change (to lower the value of the basic variable for reductions, and to increase its value for increases in its objective function coefficient). The value of the problem always changes when you change the coefficient of a basic variable.

2.2 Changing a Right-Hand Side Constant

We discussed this topic when we talked about duality. I argued that dual prices capture the effect of a change in the amounts of available resources. When you changed the amount of resource in a non-binding constraint, increases never changed your solution. Small decreases also did not change anything, but if you decreased the amount of resource enough to make the constraint binding, your solution could change. (Note the similarity between this analysis and the case of changing the coefficient of a non-basic variable in the objective function.) Changes in the right-hand side of binding constraints always change the solution (the value of x must adjust to the new constraints). We saw earlier that the dual variable associated with the constraint measures how much the objective function will be influenced by the change.

2.3 Adding a Constraint

If you add a constraint to a problem, two things can happen. Your original solution satisfies the constraint or it doesn't. If it does, then you are finished. If you had a solution before and the solution is still feasible for the new problem, then you must still have a solution. If the original solution does not satisfy the new constraint, then possibly the new problem is infeasible. If not, then there is another solution. The value must go down. 
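The role of the dual price can be seen on the example LP (a sketch, not part of the original notes; the dual price of 1 for the first constraint is computed here numerically, not quoted from the notes): raising a binding right-hand side by one raises the optimal value by the dual price, while relaxing a slack constraint changes nothing.

```python
# Sketch: effect of changing a right-hand side constant in the example LP.
from scipy.optimize import linprog

c = [-2, -4, -3, -1]
A = [[3, 1, 1, 4], [1, -3, 2, 3], [2, 1, 3, -1]]

base = linprog(c, A_ub=A, b_ub=[12, 7, 10])    # value 42
bumped = linprog(c, A_ub=A, b_ub=[13, 7, 10])  # binding RHS: 12 -> 13

print(-base.fun, -bumped.fun)                  # 42 -> 43
print(-bumped.fun - (-base.fun))               # the marginal value, 1.0

# The second constraint has plenty of slack at the optimum, so increasing
# its RHS leaves the solution and the value untouched.
slackier = linprog(c, A_ub=A, b_ub=[12, 8, 10])
print(-slackier.fun)                           # still 42
```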
(Adding a constraint makes the problem harder to satisfy, so you cannot possibly do better than before.) If your original solution satisfies your new constraint, then you can do as well as before. If not, then you will do worse.2

2 There is a rare case in which originally your problem has multiple solutions, but only some of them satisfy the added constraint. In this case, which you need not worry about, your value will stay the same.

2.4 Relationship to the Dual

The objective function coefficients correspond to the right-hand side constants of resource constraints in the dual. The primal's right-hand side constants correspond to objective function coefficients in the dual. Hence the exercise of changing the objective function's coefficients is really the same as changing the resource constraints in the dual. It is extremely useful to become comfortable switching back and forth between primal and dual relationships.

3 Understanding Sensitivity Information Provided by Excel

Excel permits you to create a sensitivity report with any solved LP. The report contains two tables, one associated with the variables and the other associated with the constraints. In reading these notes, keep the information in the sensitivity tables associated with the first simplex algorithm example nearby.

3.1 Sensitivity Information on Changing (or Adjustable) Cells

The top table in the sensitivity report refers to the variables in the problem. The first column (Cell) tells you the location of the variable in your spreadsheet; the second column tells you its name (if you named the variable); the third column tells you the final value; the fourth column is called the reduced cost; the fifth column tells you the coefficient in the problem; the final two columns are labeled "allowable increase" and "allowable decrease." Reduced cost, allowable increase, and allowable decrease are new terms. They need definitions. The allowable increases and decreases are easier. I will discuss them first. 
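The two cases for an added constraint can be checked directly on the example LP (a sketch, not part of the original notes; the particular bounds 100 and 5 are illustrative choices, not from the notes):

```python
# Sketch: adding a constraint to the example LP.
from scipy.optimize import linprog

c = [-2, -4, -3, -1]
A = [[3, 1, 1, 4], [1, -3, 2, 3], [2, 1, 3, -1]]
b = [12, 7, 10]

# Case 1: add x1 <= 100. The old solution (x1 = 0) already satisfies it,
# so the value cannot change.
easy = linprog(c, A_ub=A + [[1, 0, 0, 0]], b_ub=b + [100])
print(-easy.fun)   # still 42

# Case 2: add x2 <= 5. The old solution (x2 = 10.4) violates it, so if the
# new problem is feasible at all, the value must go down.
hard = linprog(c, A_ub=A + [[0, 1, 0, 0]], b_ub=b + [5])
print(-hard.fun)   # strictly less than 42
```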
The allowable increase is the amount by which you can increase the coefficient of the objective function without causing the optimal basis to change. The allowable decrease is the amount by which you can decrease the coefficient of the objective function without causing the optimal basis to change. Take the first row of the table for the example. This row describes the variable x1. The coefficient of x1 in the objective function is 2. The allowable increase is 7; the allowable decrease is "1.00E+30," which means 10^30, which really means infinity. This means that provided that the coefficient of x1 in the objective function is less than 9 = 2 + 7 = original value + allowable increase, the basis does not change. Moreover, since x1 is a non-basic variable, when the basis stays the same, the value of the problem stays the same too. The information in this line confirms the intuition provided earlier and adds something new. What is confirmed is that if you lower the objective coefficient of a non-basic variable, then your solution does not change. (This means that the allowable decrease will always be infinite for a non-basic variable, and that your value will stay the same.) The example also demonstrates that increasing the coefficient of a non-basic variable may lead to a change in basis. In the example, if you increase the coefficient of x1 from 2 to anything greater than 9 (that is, if you add more than the allowable increase of 7 to the coefficient), then you change the solution. The sensitivity table does not tell you how the solution changes, but common sense suggests that x1 will take on a positive value. Notice that the line associated with the other non-basic variable of the example, x3, is remarkably similar. The objective function coefficient is different (3 rather than 2), but the allowable increase and decrease are the same as in the row for x1. It is a coincidence that the allowable increases are the same. It is no coincidence that the allowable decrease is the same.
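Allowable ranges like these can be probed numerically by re-solving a perturbed LP. A minimal sketch using a toy problem (made up for illustration; this is not the example from these notes) and scipy's `linprog`, which minimizes, so the objective coefficients are negated:

```python
from scipy.optimize import linprog

# Toy LP: max x + 4y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
A = [[1, 1], [1, 3]]
b = [4, 6]

# Base problem: x is non-basic at the optimum (0, 2).
base = linprog(c=[-1, -4], A_ub=A, b_ub=b, method="highs")
print(base.x)

# Raise the coefficient of x from 1 to 2, which exceeds its allowable
# increase (here 1/3): the basis changes and x enters at the new optimum (3, 1).
bumped = linprog(c=[-2, -4], A_ub=A, b_ub=b, method="highs")
print(bumped.x)
```

Lowering the coefficient of x instead would never change the solution, which is the infinite allowable decrease for a non-basic variable described above.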
We can conclude that the solution of the problem does not change as long as the coefficient of x3 in the objective function is less than or equal to 10. Consider now the basic variables. For x2 the allowable increase is infinite while the allowable decrease is 2.69 (it is 2 9/13 to be exact). This means that the solution won't change if you increase the coefficient of x2, but it will change if you decrease the coefficient enough (that is, by more than 2.69). The fact that your solution does not change no matter how much you increase x2's coefficient means that there is no way to make x2 > 10.4 and still satisfy the constraints of the problem. The fact that your solution does change when you decrease x2's coefficient by enough means that there is a feasible basis in which x2 takes on a value lower than 10.4. (You knew that: examine the original basis for the problem.) The range for x4 is different. Line four of the sensitivity table says that the solution of the problem does not change provided that the coefficient of x4 in the objective function stays between 16 (allowable increase 15 plus objective function coefficient 1) and -4 (objective function coefficient minus allowable decrease). That is, if you make x4 sufficiently more attractive, then your solution will change to permit you to use more x4. If you make x4 sufficiently less attractive, the solution will also change, this time to use less x4. Even when the solution of the problem does not change, when you change the coefficient of a basic variable the value of the problem will change. It will change in a predictable way. Specifically, you can use the table to tell you the solution of the LP when you take the original constraints and replace the original objective function by max 2x1 + 6x2 + 3x3 + x4 (that is, you change the coefficient of x2 from 4 to 6): the solution to the problem remains the same. The value of the solution changes because now you multiply the 10.4 units of x2 by 6 instead of 4.
The objective function therefore goes up by 20.8. The reduced cost of a variable is the smallest change in the objective function coefficient needed to arrive at a solution in which the variable takes on a positive value when you solve the problem. This is a mouthful. Fortunately, reduced costs are redundant information. The reduced cost is the negative of the allowable increase for non-basic variables (that is, if you change the coefficient of x1 by 7, then you arrive at a problem in which x1 takes on a positive value in the solution). This is the same as saying that the allowable increase in the coefficient is 7. The reduced cost of a basic variable is always zero (because you need not change the objective function at all to make the variable positive). Neglecting rare cases in which a basic variable takes on the value 0 in a solution, you can figure out reduced costs from the other information in the table: if the final value is positive, then the reduced cost is zero; if the final value is zero, then the reduced cost is negative one times the allowable increase. Remarkably, the reduced cost of a variable is also the amount of slack in the dual constraint associated with the variable. With this interpretation, complementary slackness implies that if a variable takes on a positive value in the solution, then its reduced cost is zero.

3.2 Sensitivity Information on Constraints

The second sensitivity table discusses the constraints. The cell column identifies the location of the left-hand side of a constraint; the name column gives its name (if any); the final value is the value of the left-hand side when you plug in the final values for the variables; the shadow price is the dual variable associated with the constraint; the constraint R.H. side is the right-hand side of the constraint; allowable increase tells you by how much you can increase the right-hand side of the constraint without changing the basis; the allowable decrease tells you by how much you can decrease the right-hand side of the constraint without changing the basis. Complementary Slackness guarantees a relationship between the columns in the constraint table. The difference between the "Constraint R.H. Side" column and the "Final Value" column is the slack. (So, from the table, the slack for the three constraints is 0 (= 12 - 12), 37 (= 7 - (-30)), and 0 (= 10 - 10), respectively.) We know from Complementary Slackness that if there is slack in the constraint then the associated dual variable is zero. Hence CS tells us that the second dual variable must be zero. As in the case of changes in the variables, you can figure out information on allowable changes from other information in the table. The allowable increase and decrease of non-binding constraints can be computed knowing the final value and the right-hand side constant. If a constraint is not binding, then adding more of the resource is not going to change your solution. Hence the allowable increase of a resource is infinite for a non-binding constraint. (A nearly equivalent, and also true, statement is that the allowable increase of a resource is infinite for a constraint with slack.) In the example, this explains why the allowable increase of the second constraint is infinite. One other quantity is also no surprise. The allowable decrease of a non-binding constraint is equal to the slack in the constraint. Hence the allowable decrease in the second constraint is 37. This means that if you decrease the right-hand side of the second constraint from its original value (7) to anything greater than -30, you do not change the optimal basis. In fact, the only part of the solution that changes when you do this is that the value of the slack variable for this constraint changes.
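The bookkeeping rules just described, reduced costs from the variable table and slacks from the constraint table, are easy to mechanize. A small sketch (the helper names are mine, and the rule for reduced costs ignores the degenerate case mentioned above):

```python
def reduced_cost(final_value, allowable_increase):
    """Reduced cost of a variable, ignoring degenerate basic variables at 0."""
    if final_value > 0:           # basic variable: reduced cost is zero
        return 0.0
    return -allowable_increase    # non-basic: -1 times the allowable increase

def slack(rhs, final_value):
    """Slack of a <= constraint: right-hand side minus left-hand-side value."""
    return rhs - final_value

# Constraint rows from the example: (constraint R.H. side, final value).
rows = [(12, 12), (7, -30), (10, 10)]
slacks = [slack(r, f) for r, f in rows]
print(slacks)  # [0, 37, 0]: the second constraint is the non-binding one

# Complementary slackness: positive slack forces the dual variable to zero,
# and the allowable decrease of a non-binding constraint equals its slack.
```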
In this paragraph, the point is only this: if you solve an LP and find that a constraint is not binding, then you can remove all of the unused (slack) portion of the resource associated with this constraint and not change the solution to the problem. The allowable increases and decreases for constraints that have no slack are more complicated. Consider the first constraint. The information in the table says that if the right-hand side of the first constraint is between 10 (original value 12 minus allowable decrease 2) and infinity, then the basis of the problem does not change. What these columns do not say is that the solution of the problem does change. Saying that the basis does not change means that the variables that were zero in the original solution continue to be zero in the new problem (with the right-hand side of the constraint changed). However, when the amount of available resource changes, necessarily the values of the other variables change. You can think about this in many ways. Go back to a standard example like the diet problem. Suppose your diet provides exactly the right amount of Vitamin C, but then for some reason you learn that you need more Vitamin C. You will certainly change what you eat, and (if you aren't getting your Vitamin C through pills supplying pure Vitamin C) in order to do so you probably will need to change the composition of your diet – a little more of some foods and perhaps less of others. I am saying that (within the allowable range) you will not change the foods that you eat in positive amounts. That is, if you ate only spinach and oranges and bagels before, then you will eat only these things (but in different quantities) after the change. Another thing that you can do is simply re-solve the LP with a different right-hand side constant and compare the result. To finish the discussion, consider the third constraint in the example.
The values for the allowable increase and allowable decrease guarantee that the basis that is optimal for the original problem (when the right-hand side of the third constraint is equal to 10) remains optimal provided that the right-hand side constant in this constraint is between -2.333 and 12. Here is a way to think about this range. Suppose that your LP involves four production processes and uses three basic ingredients. Call the ingredients land, labor, and capital. The outputs use different combinations of the ingredients. Maybe they are growing fruit (using lots of land and labor), cleaning bathrooms (using lots of labor), making cars (using lots of labor and a bit of capital), and making computers (using lots of capital). For the initial specification of available resources, you find that you want to grow fruit and make cars. If you get an increase in the amount of capital, you may wish to shift into building computers instead of cars. If you experience a decrease in the amount of capital, you may wish to shift away from building cars and into cleaning bathrooms instead. As always when dealing with duality relationships, the "Adjustable Cells" table and the "Constraints" table really provide the same information. Dual variables correspond to primal constraints. Primal variables correspond to dual constraints. Hence, the "Adjustable Cells" table tells you how sensitive primal variables and dual constraints are to changes in the primal objective function. The "Constraints" table tells you how sensitive dual variables and primal constraints are to changes in the dual objective function (right-hand side constants in the primal).

4 Example

In this section I will present another formulation example and discuss the solution and sensitivity results. Imagine a furniture company that makes tables and chairs. A table requires 40 board feet of wood and a chair requires 30 board feet of wood.
Wood costs $1 per board foot and 40,000 board feet of wood are available. It takes 2 hours of skilled labor to make an unfinished table or an unfinished chair. Three more hours of labor will turn an unfinished table into a finished table; two more hours of skilled labor will turn an unfinished chair into a finished chair. There are 6,000 hours of skilled labor available. (Assume that you do not need to pay for this labor.) The prices of output are given in the table below:

Product           Price
Unfinished Table  $70
Finished Table    $140
Unfinished Chair  $60
Finished Chair    $110

We want to formulate an LP that describes the production plans that the firm can use to maximize its profits. The relevant variables are the number of finished and unfinished tables (call them TF and TU) and the number of finished and unfinished chairs (CF and CU). The revenue is (using the table): 70TU + 140TF + 60CU + 110CF, while the cost is 40TU + 40TF + 30CU + 30CF (because lumber costs $1 per board foot). The constraints are:

1. 40TU + 40TF + 30CU + 30CF <= 40000.
2. 2TU + 5TF + 2CU + 4CF <= 6000.

The first constraint says that the amount of lumber used is no more than what is available. The second constraint states that the amount of labor used is no more than what is available. Excel finds the answer to the problem to be to construct only finished chairs (1333.33 of them – I'm not sure what it means to make and sell 1/3 of a chair, but let's assume that this is possible). The profit is $106,666.67. Here are some sensitivity questions.

1. What would happen if the price of unfinished chairs went up? Currently they sell for $60. Because the allowable increase in the coefficient is $50, it would not be profitable to produce them even if they sold for the same amount as finished chairs. If the price of unfinished chairs went down, then certainly you wouldn't change your solution.

2. What would happen if the price of unfinished tables went up?
Here something apparently absurd happens. The allowable increase is greater than 70. That is, even if you could sell unfinished tables for more than finished tables, you would not want to sell them. How could this be? The answer is that at current prices you don't want to sell finished tables. Hence it is not enough to make unfinished tables more profitable than finished tables; you must make them more profitable than finished chairs. Doing so requires an even greater increase in the price.

3. What if the price of finished chairs fell to $100? This change would alter your production plan, since this would involve a $10 decrease in the price of finished chairs and the allowable decrease is only $5. In order to figure out what happens, you need to re-solve the problem. It turns out that the best thing to do is specialize in finished tables, producing 1000 and earning $100,000. Notice that if you continued with the old production plan your profit would be 70 x 1333 1/3 = 93,333 1/3, so the change in production plan was worth more than $6,000.

4. How would profit change if lumber supplies changed? The shadow price of the lumber constraint is $2.67. The range of values for which the basis remains unchanged is 0 to 45,000. This means that if the lumber supply went up by 5000, then you would continue to specialize in finished chairs, and your profit would go up by $2.67 x 5000 = $13,333. At this point you presumably run out of labor and want to reoptimize. If lumber supply decreased, then your profit would decrease, but you would still specialize in finished chairs.

5. How much would you be willing to pay an additional carpenter? Skilled labor is not worth anything to you. You are not using all the labor that you have. Hence, you would pay nothing for additional workers.

6. Suppose that industrial regulations complicate the finishing process, so that it takes one extra hour per chair or table to turn an unfinished product into a finished one.
How would this change your plans? You cannot read your answer off the sensitivity table, but a bit of common sense tells you something. The change cannot make you better off. On the other hand, to produce 1,333.33 finished chairs you'll need 1,333.33 extra hours of labor. You do not have that available. So the change will change your profit. Using Excel, it turns out that it becomes optimal to specialize in finished tables, producing 1000 of them and earning $100,000. (This problem differs from the original one because the amount of labor needed to create a finished product increases by one unit.)

7. The owner of the firm comes up with a design for a beautiful hand-crafted cabinet. Each cabinet requires 250 hours of labor (this is 6 weeks of full-time work) and uses 50 board feet of lumber. Suppose that the company can sell a cabinet for $200; would it be worthwhile? You could solve this problem by changing the problem, adding an additional variable and an additional constraint. Note that the coefficient of cabinets in the objective function is 150, which reflects the sale price minus the cost of lumber. I did the computation. The final value increased to 106,802.7211. The solution involved reducing the output of finished chairs to 1319.727891 and increasing the output of cabinets to 8.163265306. (Again, please tolerate the fractions.) You could not have guessed these figures in advance, but you could figure out that making cabinets was a good idea. The way to do this is to value the inputs to the production of cabinets. Cabinets require labor, but labor has a shadow price of zero. They also require lumber. The shadow price of lumber is $2.67, which means that each unit of lumber adds $2.67 to profit. Hence 50 board feet of lumber would reduce profit by $133.50. Since this is less than the price at which you can sell cabinets (minus the cost of lumber), you are better off using your resources to build cabinets. (You can check that the increase in profit associated with making cabinets is $16.50, the added profit per unit, times the number of cabinets that you actually produce.) I attached a sheet where I did the same computation assuming that the price of cabinets was $150. In this case, the additional option does not lead to cabinet production.
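The furniture example and some of the sensitivity experiments above can be reproduced with an off-the-shelf solver. A sketch using scipy's `linprog` (which minimizes, so the profit coefficients, price minus lumber cost, are negated); the lumber shadow price is estimated the brute-force way, by re-solving with one extra board foot:

```python
from scipy.optimize import linprog

# Variables: [TU, TF, CU, CF]; objective coefficients = price minus lumber cost.
profit = [70 - 40, 140 - 40, 60 - 30, 110 - 30]   # [30, 100, 30, 80]
A = [[40, 40, 30, 30],   # lumber (board feet)
     [2,  5,  2,  4]]    # skilled labor (hours)
b = [40000, 6000]

res = linprog(c=[-p for p in profit], A_ub=A, b_ub=b, method="highs")
print(res.x, -res.fun)   # only finished chairs, ~1333.33 of them, profit ~$106,666.67

# Shadow price of lumber: change in profit per extra board foot (~$2.67).
more = linprog(c=[-p for p in profit], A_ub=A, b_ub=[40001, 6000], method="highs")
print(-(more.fun - res.fun))

# Question 3: finished chairs fall to $100, so their profit coefficient is 70.
cheap = linprog(c=[-30, -100, -30, -70], A_ub=A, b_ub=b, method="highs")
print(cheap.x, -cheap.fun)   # specialize in finished tables: 1000 of them, $100,000
```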

Tuesday, October 22, 2019

Learn about the Doppler Effect

Astronomers study the light from distant objects in order to understand them. Light moves through space at about 300,000 kilometers per second, and its path can be deflected by gravity as well as absorbed and scattered by clouds of material in the universe. Astronomers use many properties of light to study everything from planets and their moons to the most distant objects in the cosmos.

Delving into the Doppler Effect

One tool they use is the Doppler effect. This is a shift in the frequency or wavelength of radiation emitted from an object as it moves through space. It's named after Austrian physicist Christian Doppler, who first proposed it in 1842.

How does the Doppler effect work? If the source of radiation, say a star, is moving toward an astronomer on Earth (for example), then the wavelength of its radiation will appear shorter (higher frequency, and therefore higher energy). On the other hand, if the object is moving away from the observer, then the wavelength will appear longer (lower frequency, and lower energy). You have probably experienced a version of the effect when you heard a train whistle or a police siren change pitch as it passed you and moved away. The Doppler effect is behind such technologies as police radar, where the radar gun emits light of a known wavelength. Then, that radar light bounces off a moving car and travels back to the instrument. The resulting shift in wavelength is used to calculate the speed of the vehicle. (Note: it is actually a double shift, as the moving car first acts as the observer and experiences a shift, then as a moving source sending the light back to the officer, thereby shifting the wavelength a second time.)

Redshift

When an object is receding (i.e., moving away) from an observer, the peaks of the radiation that it emits will be spaced farther apart than they would be if the source object were stationary.
The result is that the wavelength of the light appears longer. Astronomers say that it is shifted to the red end of the spectrum. The same effect applies to all bands of the electromagnetic spectrum, such as radio, X-rays, or gamma rays, but optical measurements are the most common and are the source of the term redshift. The more quickly the source moves away from the observer, the greater the redshift. From an energy standpoint, longer wavelengths correspond to lower-energy radiation.

Blueshift

Conversely, when a source of radiation is approaching an observer, the wavelengths of light appear closer together, effectively shortening the wavelength of light. (Again, a shorter wavelength means a higher frequency and therefore higher energy.) Spectroscopically, the emission lines would appear shifted toward the blue side of the optical spectrum, hence the name blueshift. As with redshift, the effect is applicable to other bands of the electromagnetic spectrum, but it is most often discussed when dealing with optical light, though in some fields of astronomy this is certainly not the case.

Expansion of the Universe and the Doppler Shift

Use of the Doppler shift has resulted in some important discoveries in astronomy. In the early 1900s, it was believed that the universe was static. In fact, this led Albert Einstein to add the cosmological constant to his famous field equation in order to cancel out the expansion (or contraction) that was predicted by his calculation. Specifically, it was once believed that the edge of the Milky Way represented the boundary of the static universe. Then, Edwin Hubble found that the so-called spiral nebulae that had plagued astronomy for decades were not nebulae at all. They were actually other galaxies. It was an amazing discovery and told astronomers that the universe is much larger than they knew. Hubble then proceeded to measure the Doppler shift, specifically finding the redshift of these galaxies.
He found that the farther away a galaxy is, the more quickly it recedes. This led to the now-famous Hubble's Law, which says that an object's distance is proportional to its speed of recession. This revelation led Einstein to write that his addition of the cosmological constant to the field equation was the greatest blunder of his career. Interestingly, however, some researchers are now placing the constant back into general relativity. As it turns out, Hubble's Law is only true up to a point, since research over the last couple of decades has found that distant galaxies are receding more quickly than predicted. This implies that the expansion of the universe is accelerating. The reason for that is a mystery, and scientists have dubbed the driving force of this acceleration dark energy. They account for it in the Einstein field equation as a cosmological constant (though it is of a different form than Einstein's formulation).

Other Uses in Astronomy

Besides measuring the expansion of the universe, the Doppler effect can be used to model the motion of things much closer to home; namely, the dynamics of the Milky Way Galaxy. By measuring the distance to stars and their redshift or blueshift, astronomers are able to map the motion of our galaxy and get a picture of what our galaxy may look like to an observer from across the universe. The Doppler effect also allows scientists to measure the pulsations of variable stars, as well as the motions of particles traveling at incredible velocities inside relativistic jets emanating from supermassive black holes.

Edited and updated by Carolyn Collins Petersen.
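For slow-moving sources, the wavelength-shift arithmetic described in this article reduces to z = (observed wavelength - emitted wavelength) / emitted wavelength, with velocity roughly c times z, and Hubble's Law gives a distance from that velocity. A small sketch; the Hubble constant is taken as roughly 70 km/s per megaparsec, a commonly quoted approximate value:

```python
C_KM_S = 299_792.458          # speed of light in km/s
H0 = 70.0                     # Hubble constant, km/s per Mpc (approximate)

def redshift(observed_nm, emitted_nm):
    """Fractional wavelength shift z; positive = redshift, negative = blueshift."""
    return (observed_nm - emitted_nm) / emitted_nm

def recession_velocity(z):
    """Non-relativistic approximation v = c * z, valid for small z."""
    return C_KM_S * z

def hubble_distance(v_km_s):
    """Hubble's Law: distance in megaparsecs for a given recession velocity."""
    return v_km_s / H0

# A hydrogen line emitted at 656.3 nm but observed at 662.9 nm:
z = redshift(662.9, 656.3)        # ~0.01: the galaxy is receding
v = recession_velocity(z)         # ~3000 km/s
print(round(z, 4), round(v), round(hubble_distance(v), 1))
```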

Monday, October 21, 2019

Career Path as a Nurse Essays

Nursing has become one of the most respected professions in the world. Nursing combines science and technology with people skills like communication, problem solving, teaching, and compassion. I have found many benefits of professional nursing, like personal satisfaction, career mobility, job security, scheduling flexibility, and competitive salaries. The variety of opportunities for nurses is endless. A nurse can work in different fields like hospitals, corporations, research labs, health insurance companies, rehabilitation centers, etc. A nurse can earn a competitive salary and work anywhere in the world. Nurses can specialize in the same way doctors do, in areas such as pediatrics, geriatrics (caring for the elderly), emergency medicine, etc. With advanced education, nurses can become independent clinical specialists like nurse midwives, nurse anesthetists, or nurse practitioners. All of the above factors have affected my decision to choose nursing as a professional career.

Nursing has become one of the better-paying occupations in America over the last several years. LPNs have no trouble earning $35,000 to $40,000 a year, not including overtime. For registered nurses, wages are even higher. Newly minted registered nurses can make between $40,000 and $60,000 right out of nursing school, depending on their location or willingness to relocate. In 2004, the median income for all registered nurses was over $52,000. These figures are very likely to continue to increase, given the coming surge of baby boomers entering their golden years. There aren't nearly enough nurses now to fill all the available nursing jobs, and the shortage is going to be severe in 10 or 15 years. Experts believe that we'll be almost a million nurses short of what we need by 2020. And since wages are set by supply and demand, that means that wages will continue to go up, probably sharply.
Hospitals and retirement homes will be forced to offer more and more money and benefits to attract and retain quality nurses. In addition to money, opportunity, and job security, there are lots of other benefits to becoming a nurse. First, there is the universal respect nurses enjoy from virtually everyone. According to the Gallup Company, which does polling, Americans consider nursing to be the number one profession when it comes to honesty and ethics. Not only are nurses paid well, but they actually do good deeds for a living and, unlike some recent corporate officers in the news, can hold their heads up with pride. Nurses have the pride of helping people find relief in life-threatening situations. Being a caregiver for a living brings intangible rewards far beyond the monetary. Beyond respect, there's also flexibility: a nurse can choose where to work, which hospital department to work in, whether to work in a hospital at all, part time or full time. More and more nurse staffing agencies are popping up all over America, and many nurses choose to work exclusively through one or more of these agencies, which affords them the opportunity to work with lots of different people in various settings. Some use these agencies to sample a job at a particular hospital or other workplace and, if they like it, to apply for a full-time position with the firm directly. Others don't want to be tied down, like new and different experiences, and choose to work for agencies for years at a time. In any case, the nursing profession gives a person the flexibility to design a career in the way he or she wishes.

Sunday, October 20, 2019

Accomplishment Report

Commercial property valuation requires a more complex method, taking into account the income potential of the property, historical revenue, cash flow with owner perks removed, and much more.

b. Residential Properties: this type of property is by far the most popular with both new and experienced agents. Real estate agents then further specialize in types of homes, including condominiums, separate homes, duplexes, high-value homes, vacation homes, etc.

c. Industrial: land situated in areas that are exclusively reserved and used for industrial purposes.

2. Land Improvement

3. Chattel: a term which refers to personal property that can be moved; it is also known as movable property. Some examples of chattel include jewelry, cars, and furniture. Some people just call chattel "personal property," differentiating it from things like real estate with the term "immovable property."

Assessors also look at building value under the cost analysis method, but only in terms of how much it cost to construct the buildings. The corporation determines, at the outset of incorporating, how many shares it shall issue and what classes of shares (No Par, Par, Common, Preferred, Participating, etc.) it will issue.

Valuation Procedures

1. Cost Approach: this valuation method is based on the principle that no prudent purchaser will pay more than what it would cost him to acquire an equally desirable substitute site and to build a similar improvement of equal desirability and utility.

2. Income Approach: based on the principle that value tends to be set by the present worth of the right to future net benefits that may be derived from ownership.

Important documents that an appraiser should be able to look at and verify:

a. Sales invoice
b. Letter of credit
c. Deed of assignment

CHAPTER 11 SHARES OF STOCKS AS COLLATERAL

Capital stock has to do with all the shares of stock that represent the ownership of a given company.
The exact number of shares that can be issued as capital stock is normally recorded in the current balance sheet for a company. Capital stock will involve all types or classes of stock that the company is authorized to issue. The basis for issuing capital stock is normally outlined in the charter of the corporation. Common stock is stock in a company which comes with voting rights and an opportunity to share in the profits of the company. This type of stock is commonly issued by companies making offerings of stock and is a popular choice for people interested in buying and selling stocks. Prices for common stock vary depending on market pressures. Stock exchanges offer opportunities for people to buy, sell, and trade common stock with each other and with brokers. This type of stock should be contrasted with preferred stock, another type of stock which works slightly differently. Preferred stock offers several advantages over common stock. The first advantage is a fixed dividend, which generates more reliable returns than common stock, although it also means that the stockholder can miss out when large profits are made, because the dividend will not be adjusted. Preferred stock, also known as non-participating preferred stock, is a type of stock that pays the investor a specific dividend only. In addition, in the event of a bankruptcy, preferred stockholders are ahead of holders of common stock, as are creditors, lien holders, and so forth. There are some advantages to holding common stock. Voting rights can be important because they allow people to vote on members of the board of directors, policy, and stock splits, which gives them a role in the governance of the company. Convertible preferred stock is a type of preferred stock that has the option of being converted into common shares issued by the same company; it is one of the less commonly employed approaches to issuing shares of stock.
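The income approach from the valuation procedures above (value as the present worth of future net benefits) applies neatly to a preferred share's fixed dividend: a level perpetual payment discounted at a required rate of return is worth payment divided by rate. A minimal sketch; the par value, dividend rate, and required return are made-up illustrative figures:

```python
def preferred_dividend(par_value, dividend_rate):
    """Annual fixed dividend on a preferred share: a percentage of par value."""
    return par_value * dividend_rate

def perpetuity_value(annual_payment, required_return):
    """Income approach for a level perpetual payment: present worth = payment / rate."""
    return annual_payment / required_return

# Hypothetical preferred share: $100 par, 6% fixed dividend, 8% required return.
dividend = preferred_dividend(100.0, 0.06)   # $6 per year
value = perpetuity_value(dividend, 0.08)     # $75: worth less than par at the higher rate
print(dividend, value)
```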
Participating preferred stock dividends are usually a fixed percentage of the par value of the stock. Participating preferred stock owners usually do not have any voting rights at stockholder meetings; owners of common stock do have voting rights. Cumulative participating preferred stock can accrue dividends that will be paid to the investor once the company's performance improves. In finance, par value is the least amount that a share of stock can be sold for, according to the terms and conditions found in the regulations of the issuing company.

CHAPTER 12 LAND AS COLLATERAL

Borrowing funds often requires the designation of collateral on the part of the recipient of the loan. Collateral is simply assets that have been pledged by the recipient as security on the value of the loan. In the event that circumstances make it impossible for the recipient to repay the loan, ownership of the collateral is transferred to the entity that issued the loan in order to settle the debt.

The function of land: it provides "standing room." In spite of the fact that man has learned to fly, and to dive under the surface of the water in submersible ships, we are still bound pretty close to the surface of the earth.

Modes of acquiring title:

1. Public grant: acquisition of public land by homestead patent, sales patent, and miscellaneous patent.
2. Private grant: voluntary transfer or conveyance, as by deed of sale, donation, exchange, or assignment.
3. Involuntary grant: acquisition against the consent of the former owner, such as foreclosure or sale.
4. Inheritance: acceptance of hereditary succession.
5. Reclamation: filling of submerged land, subject to government regulation and existing laws.
6. Accretion: more land adjoining the banks of rivers due to gradual deposit of soil.
7. Prescription: title by actual, open, continuous, and uninterrupted possession for a period of time under claim of title.
Zonal valuation

Different approaches to the valuation of properties have been introduced in this country. In the case of land, its price is dictated not only by the interplay of supply and demand but also by the concept of zonal valuation instituted by the government, through agencies like the Office of the Register of Deeds under the Department of Justice.

IMPORTANT FACTORS IN OWNERSHIP

PEACE AND ORDER

The prevailing peace and order situation affects the value of land. Today, a number of areas in the country are troubled by the presence of bandits and other lawless elements, such as the NPAs, who are known to have been exacting a so-called "revolutionary tax" on business establishments in those areas. Such deplorable conditions inhibit buyers from taking any interest in buying such land, or even in locating their business establishments there.

CHAPTER 13 COLLECTION POLICIES AND PROCEDURES

Collections are part of a process in the accounts receivable or billing department. It means that, at some point in time, a company extended to another company or an individual credit terms for goods or services, or a cash loan advance of some kind that was to be paid or repaid at a certain time. If that bill is not paid when it is due, or within an agreed-upon grace period, the collection process begins. Collection procedures usually consist of a set of in-house company policies that are written in a manual or guidebook of some kind, though smaller companies may not have a manual. Usually, law firms that engage in collection practices will have manuals and training classes for their employees before they make their first collection call to a debtor. Most of the time, large corporations and small companies alike have a collection manager or collection department that will go through certain housekeeping procedures before an unpaid debt is turned over to a lawyer.

Laws and Regulations

The laws that cover collection policies and procedures are mandated by federal and state governments.
On the federal level, the Federal Trade Commission regulates what is called the Fair Debt Collection Practices Act (FDCPA). In the case of a conflict between state and federal law, federal law prevails. Those who extend credit to others should be aware of the legal rules about how to collect money that is past due, particularly as those rules apply to bankruptcy.

A collection policy is a set of business practices and procedures that outline the way a company goes about collecting money owed to it as a result of an extension of credit. Companies often allow their best business customers to establish payment terms that give the customer an extended amount of time, such as 30, 60, or 90 days, to pay an outstanding invoice. Other companies extend credit to individual consumers and implement a collection policy to control the process of obtaining payment on the credit account. Credit extensions allow individual consumers to obtain needed merchandise upfront but pay for the purchase over time. In the case of business-to-business transactions, the extension of credit is carried on the supplier's books under accounts receivable. Extensions of consumer credit are typically carried on the books under a separate consumer credit category that is also a type of receivable. Accounts receivable is a company's list of outstanding extensions of credit to customers. The company's collection policy establishes how the accounts receivable or collections department should go about reminding customers that payments are due, and how the department should handle delinquent accounts, that is, accounts that are not paid as agreed.

Types of Bad Debt Buyers

Also known as junk debt buyers, bad debt buyers are firms that purchase unpaid debts from different types of creditors at rates below the actual face value of the debts, and then attempt to collect the full amount plus interest and penalties from the debtor.
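The bad-debt-buyer economics just described (buying debts below face value, then collecting as much of the full amount as possible) reduce to simple arithmetic. The discount rate and amounts below are illustrative assumptions:

```python
# Hypothetical bad-debt purchase: the buyer pays a fraction of face value
# for the debt, then profits on whatever portion it manages to collect.
def debt_buyer_profit(face_value, purchase_rate, collected):
    cost = face_value * purchase_rate
    return collected - cost

# Buying P100,000 of old credit card debt for 10 centavos on the peso,
# then collecting only P25,000 of it, still yields a P15,000 profit.
print(debt_buyer_profit(100_000, 0.10, 25_000))  # -> 15000.0
```

The deep discount explains why the business works even when most of the debt is never recovered.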
Bad debt buyers sometimes specialize in securing and collecting specific types of debt, including credit card debt, business debt, or loan debt. Credit card bad debt buyers are among the more common types of junk debt buyers. Here, the buyer purchases old credit card accounts with outstanding balances that the originator was unable to collect.

CHAPTER 14 BANKRUPTCY

Bankruptcy is the process by which a person legally declares himself or his business unable to pay outstanding debts. Depending upon the type filed, one meets with a judge to determine a payment schedule, or has a legal bankruptcy discharge most if not all debts. Businesses also may declare bankruptcy, which means either that the business will close or that the business will continue to operate with reduced payments to its creditors. How long the process takes depends on what type the person intends to file, and also on how quickly he or she can gather together information about his or her income and debts. Bankruptcy is most commonly filed when a person does not have a large number of assets that he or she needs to protect. Financial distress may also occur due to unforeseen factors that have an adverse effect on the different revenue streams that the corporation enjoys. A bankruptcy action may be necessary to protect the business from creditors while the company is reorganized under the direction of the courts, allowing the corporation at least a chance of getting back on a firm financial foundation.

Liquidation may be partial or complete, depending on the amount of debt involved. With a partial liquidation, the business sells off assets, including divisions of the business that are not needed for the continued operation of the core businesses. A complete liquidation means the sale of all assets and the eventual dismantling of the company as a business entity.

Insolvency is the inability of a person to meet his obligations as they mature (equity sense).
In the bankruptcy sense, it refers to the excess of liabilities (in the case of a corporation, excluding capital stock) over assets.

Two Types of Insolvency

Voluntary Insolvency. Under voluntary insolvency, an insolvent debtor owing debts exceeding in amount the sum of P1,000.00 may apply to be discharged from his debts and liabilities by filing a petition with the Court of First Instance of the province or city which has been the domicile of the petitioner for six months preceding the petition. He shall moreover annex to his petition a schedule and inventory in the prescribed form.

Declaration of Insolvency. Upon receipt of such petition, together with the schedule and inventory, the court, or the judge thereof in vacation, shall make an order declaring the petitioner insolvent.

Involuntary Insolvency. An adjudication of insolvency may be made on the petition of three or more creditors, residents of the Philippines, whose credits or demands accrued in the Philippines, and whose credits or demands are in the aggregate not less than one thousand pesos; provided, that none of the said creditors has become a creditor by assignment, however made, within 30 days prior to the filing of said petition.

The following shall be considered acts of insolvency, and the petition for insolvency shall set forth one or more of such acts:
1. That such person is about to depart or has departed from the Philippines, with intent to defraud his creditors;
2. That, being absent from the Philippines with intent to defraud his creditors, he remains absent;
3. That he conceals himself to avoid the service of legal process for the purpose of hindering or delaying or defrauding his creditors;
4. That he conceals or is removing any of his property to avoid its being attached or taken in legal process;
5. That he has suffered his property to remain under attachment or legal process for 3 days for the purpose of hindering or delaying or defrauding his creditors;
6. That he has confessed or offered to allow judgment in favor of any creditor or claimant for the purpose of hindering or delaying or defrauding his creditors or claimants;
7. That he has willfully suffered judgment to be taken against him by default for the purpose of hindering or delaying or defrauding his creditors or claimants;
8. That he has suffered or procured his property to be taken on legal process with the intent to give a preference to one or more of his creditors and thereby hinder, delay, or defraud any of his creditors;
9. That he has made any assignment, gift, sale, conveyance, or transfer of his estate, property, rights, or credits for the purpose of hindering or delaying or defrauding his creditors or claimants;
10. That he has, in contemplation of insolvency, made any payment, gift, grant, sale, conveyance, or transfer of his estate, property, rights, or credits;
11. That, being a merchant or tradesman, he has generally defaulted in the payment of his current obligations for a period of 30 days;
12. That for a period of 30 days he has failed, after demand, to pay any money deposited with him or received by him in a fiduciary capacity; and
13. That, an execution having been issued against him on final judgment for money, he shall have been found to be without sufficient property subject to execution to satisfy the judgment.
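As a compact, non-authoritative restatement of the involuntary-insolvency filing thresholds above (the statute, not this sketch, governs; the helper function and data shape are hypothetical):

```python
# Sketch of the involuntary-insolvency filing thresholds stated above:
# three or more resident creditors, aggregate claims of at least P1,000,
# and none who became a creditor by assignment within 30 days of filing.
def petition_allowed(creditors):
    """creditors: list of (claim_amount, days_since_assignment or None)."""
    eligible = [amount for amount, assigned_days in creditors
                if assigned_days is None or assigned_days > 30]
    return len(eligible) >= 3 and sum(eligible) >= 1000

print(petition_allowed([(400, None), (350, None), (300, 45)]))  # -> True
print(petition_allowed([(400, None), (350, 10), (300, None)]))  # -> False
```

In the second case the recently assigned claim is excluded, leaving only two qualifying creditors, so the petition fails the three-creditor requirement.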