Featured Post

Japanese Management Essay

In the mid-1980s, William Ouchi attested on U.S. soil to the noteworthiness of Theory Z (1981), a Japanese management style that rememb...

Sunday, January 26, 2020

Literature Review On Web Usage Mining Computer Science Essay

The Internet has become the largest database that has ever existed. Early in this decade it was estimated that the Internet contained more than 350 million pages [11], but research conducted a few years ago found that the indexed part of the World Wide Web alone consists of at least 11.5 billion pages [12]. The number of people using the Internet is also growing rapidly; a survey conducted by the Computer Industry Almanac is evidence of this. According to its results, the number of online users crossed one billion in 2005, up from only 45 million in 1995, and the number was predicted to cross two billion by 2011 [13]. For users of the Internet, finding the required information in this large volume of data has become extremely difficult, so efficient methods of information retrieval are essential. It has also been found that more than 90% of the data is in unstructured format, so organizing and structuring this data has become a very important issue among researchers. With this large amount of information available on the web, business processes need to transcend simple document retrieval and move toward knowledge discovery: businesses try to extract useful patterns from the available data that help them better understand their customers' needs, which in turn provides better customer satisfaction.

Literature Review on Web Usage Mining

Web mining helps web designers discover knowledge from the information available on the web, and helps users retrieve the information they are looking for quickly. The three major areas of web mining are:

Web content mining - extracting useful information from the text, images, audio and video in web pages.
Web structure mining - understanding the link structure of the web, which helps in categorizing web pages.
Web usage mining - extracting useful information from server logs to understand what users are looking for; it also helps in the personalization of web pages.

Though all three categories of web mining are interlinked, this research discusses web usage mining. Web usage mining helps webmasters understand what users are looking for, so that they can develop strategies to help users reach the required information quickly. It is generally implemented using the navigational traces of users, which give knowledge about user preferences and behavior. The navigational patterns are then analyzed, and the users are grouped into clusters. Classifying navigational patterns into groups improves the quality of personalized web recommendations, which are used to predict the web pages a user is most likely to access in the near future. This kind of personalization also helps in reducing the network traffic load and in finding the search pattern of a particular group of users. Data mining techniques such as clustering, sequential pattern mining and association rule mining are used in web mining. All these techniques are used to extract interesting and frequent patterns from the information recorded in web server logs; these patterns are used to understand user needs and help web designers improve web services and the personalization of web sites.
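As a concrete illustration of where this raw material comes from, here is a minimal Python sketch that parses server-log lines into per-user page sequences. The Common Log Format layout, the field positions and the sample line are assumptions for illustration, since log layouts vary between servers.

    import re
    from collections import defaultdict

    # Assumed Common Log Format: ip - - [timestamp] "GET /page HTTP/1.0" status size
    LOG_PATTERN = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "GET (\S+) [^"]*" (\d{3}) \S+')

    def parse_log(lines):
        """Group requested URLs by client IP, preserving request order."""
        sequences = defaultdict(list)
        for line in lines:
            match = LOG_PATTERN.match(line)
            if match:
                ip, timestamp, url, status = match.groups()
                if status == "200":          # keep only successful page requests
                    sequences[ip].append(url)
        return sequences

    sample = ['192.0.2.1 - - [26/Jan/2020:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 1043']
    print(parse_log(sample))

The per-IP sequences produced this way are the raw web access sequences that the techniques discussed below operate on.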
Web Access Sequence

Generally, web usage mining is done based on the navigation history stored in the logs of the web server. This navigation history is also called a web access sequence; it contains information about the pages a user visits, the time spent on each page and the path the user traverses within the website. A web access sequence therefore contains all the details of the pages a user visited during a single session. The data obtained from the log files is subjected to various data mining techniques to obtain useful patterns that describe the user profile or behavior. These patterns act as the base knowledge for developing intelligent online applications and for improving the quality of web personalization, web recommendations, etc.

Web mining can generally be classified into two categories, online mining and offline mining. Offline mining uses the data stored in the log files to find navigational patterns, while online mining uses the requests of a user in the current active session. The current user profile is decided by matching the recommendations from both the online and offline methods. Several systems have been designed to implement web usage mining; Analog is one of the first systems developed for this purpose. It has two components, an online component and an offline component. The offline component reformats the data available in the log file. Generally, the web server log contains information such as the IP address of the client, the time at which the web page was requested, the URL of the web page, the HTTP status code, etc. The available data is cleaned by removing the unwanted information, after which the system analyzes the users' past activities from the information in the web server log files and classifies the user sessions into clusters. The online component then classifies the active user sessions based on the model generated by the offline component. Once the user group is found, the system gives a list of suggestions for each user request; the suggestions depend on the user group to which the user belongs.

Clustering Techniques

One important part of web usage mining is the process of clustering users into groups based on their profiles and search patterns. The clustering of user sessions can be done in several ways. Christos et al. represent each page as a unique symbol, which turns a web access sequence into a string [1]. Consider S as the set of all possible web access sequences. The web mining system processes this set S offline, as a background process or during idle time, to group the pages into clusters such that similar sequences fall in the same cluster. The formed clusters are represented by means of a weighted suffix tree. The clustering is done by constructing a similarity matrix [1], which is then given as input to the k-windows clustering algorithm [10] to generate clusters of very similar web access sequences. When two web access sequences have the same length, the global alignment has to be taken into account rather than the local alignment. Scores are calculated for both the local and the global alignment; a simple way to calculate the scores is to assign a positive value to a match and a negative value to a mismatch. Two web access sequences are said to be similar if they have the maximum alignment between their sequences; a minimal sketch of such a scoring scheme follows.
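The following Python sketch scores the global alignment of two web access sequences with the simple +1 match / -1 mismatch scheme described above. It is an illustration of the idea, not the exact algorithm used in [1]; in particular, the gap penalty is an added assumption.

    def global_alignment_score(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
        """Needleman-Wunsch style global alignment score for two page sequences."""
        rows, cols = len(seq_a) + 1, len(seq_b) + 1
        score = [[0] * cols for _ in range(rows)]
        for i in range(1, rows):                 # aligning a page against a gap
            score[i][0] = i * gap
        for j in range(1, cols):
            score[0][j] = j * gap
        for i in range(1, rows):
            for j in range(1, cols):
                diag = score[i-1][j-1] + (match if seq_a[i-1] == seq_b[j-1] else mismatch)
                score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
        return score[-1][-1]

    # Pages encoded as symbols, e.g. 'A' = /index.html, 'B' = /products.html
    print(global_alignment_score("ABCD", "ABD"))   # similar sequences score higher

Pairwise scores computed this way can populate the similarity matrix that the clustering algorithm consumes.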
Sometimes the web pages listed in a sequence may be unimportant to the user: the user may have reached a page by a wrong click, and in such cases will leave it immediately. The user thus stays only a short time on these kinds of unimportant pages, so before aligning web sequences we have to take such factors into account in order to obtain useful patterns. C. Suresh et al. proposed an approach in which the clusters are identified by distance-based clustering methods; they also developed a framework to compare the performance of various clustering methods based on replicated clustering. In traditional methods, the distance between two user sessions is calculated using the Euclidean distance measure, but experiments show that the sequence alignment method represents the behavioral characteristics of web users better than the Euclidean distance method. Cadez et al. [14] categorize user sessions into general topics and represent the behavior of each topic with a Markov chain. Fu et al. [15] use the Balanced Iterative Reducing and Clustering using Hierarchies (BIRCH) algorithm for clustering at the page level. BIRCH is a distance-based hierarchical algorithm used for clustering web user sessions; it has been noticed, however, that an increase in the number of pages diminishes its performance. Since each website contains hundreds of pages, considering each page as a separate state makes the clustering unmanageable. To overcome this difficulty, the authors proposed an approach that generalizes the sessions using attribute-oriented induction, so that the clustering of pages is done at the category level. It has always been difficult to map a particular page to a specific category, but this can be done using clustering algorithms.

The most commonly used clustering algorithm is k-means, but its major disadvantage is that the number of clusters must be specified in advance, which is not possible in a real-world scenario. To overcome this problem, researchers have used fuzzy ART neural networks, an unsupervised learning approach in which there is no need to specify the number of clusters in advance. The main issue with fuzzy ART networks is category proliferation, which leads to unrestricted growth in the number of clusters: sometimes the network produces a large number of clusters with only a few members each. After considering the merits and demerits of both algorithms, the authors proposed a hybrid approach called FAK. The FAK algorithm has two phases. In the first phase, fuzzy ART is used as an initial seed generator, and clusters whose centroids lie near others are removed, thereby addressing the category proliferation problem. The first phase is followed by applying the k-means algorithm in the second phase to obtain the final clusters. They found that the FAK algorithm performs much better than the other methods; a simplified sketch of this two-phase idea appears below.
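A simplified Python sketch of the two-phase FAK idea, assuming scikit-learn is available. The seed-pruning step is only a rough stand-in for the fuzzy ART phase, and the distance threshold and binary session vectors are illustrative assumptions, not details from the paper.

    import numpy as np
    from sklearn.cluster import KMeans

    def prune_close_seeds(seeds, min_distance=0.5):
        """Drop seed centroids lying too close to an already kept seed
        (a rough stand-in for curbing fuzzy ART's category proliferation)."""
        kept = []
        for s in seeds:
            if all(np.linalg.norm(s - k) >= min_distance for k in kept):
                kept.append(s)
        return np.array(kept)

    def fak_cluster(sessions, candidate_seeds):
        """Phase 1: prune proliferated seeds; phase 2: k-means from the survivors."""
        seeds = prune_close_seeds(candidate_seeds)
        model = KMeans(n_clusters=len(seeds), init=seeds, n_init=1).fit(sessions)
        return model.labels_

    # Sessions as binary page-visit vectors (1 = page visited)
    sessions = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 1, 1, 1]], dtype=float)
    seeds = np.array([[1.0, 1.0, 0.0, 0.0], [1.0, 1.0, 0.2, 0.0], [0.0, 0.0, 1.0, 1.0]])
    print(fak_cluster(sessions, seeds))

Note how the pruning phase removes the need to guess the cluster count in advance: k for the second phase falls out of how many well-separated seeds survive.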
An important consideration during clustering is the number of user sessions to take into account: in most cases the designers decide that it is enough to consider the first N sessions of a user for a decent recovery of his web cluster. They also decide whether or not to include sessions with short session lengths, because those sessions may not be helpful in identifying the clusters. So the two main factors to consider while performing the clustering are the number of user sessions to use and the minimum session length.

Combining Web Content Mining and Web Usage Mining

An experiment was conducted in order to extract the navigational patterns of a website's users [6]. The experiment aimed at predicting the users' gender and whether they were interested in certain website sections. When the results were analyzed, the model was found to be only 56% accurate. The reason for the low accuracy was the failure to include web content in the classification model; it is believed that exploring the content of a page helps in better understanding the user profile, thereby improving classification accuracy. Web usage mining and web content mining can be combined and used in the area of web personalization. In web personalization the contents of a web page differ for each user according to their navigational pattern: the web links the user may visit in the near future are predicted based on their profile, and those predicted links are dynamically displayed in the requested webpage. Web links of frequently visited pages are highlighted, while pages that have not been visited for a long time are removed. This hybrid approach is implemented by an initial clustering based on the contents of the web pages, followed by web access sequence alignment. Text clustering can be done effectively by the spherical k-means algorithm [10]. Since multiple sequence alignment consumes a lot of time and space, it can be effectively replaced by iterative progressive alignment. A weighted suffix tree is used to find the most frequent and important navigational patterns with little memory and computational cost. It has been shown that exploiting the content improved performance by 2-3%.

In the model proposed by Liu [7], the contents of the web pages and the information from the web server log files are used together to extract meaningful patterns; the extracted page contents are represented by means of character n-grams. The users of a web site can be classified by two approaches, proactive and reactive. The proactive approach tries to map each request to a user before or during the user's interaction with the site, while the reactive approach maps each request to a user after the user completes the interaction. The proactive approach requires browser cookies to be enabled and the user to be aware of cookies, so it is often easier to use the reactive approach, which does not require the user to have any prior knowledge. An experiment with 1500 sample sessions was conducted to evaluate the proposed method; the results show the system is 70% accurate in classification and 65% in prediction.

The success of a website also depends on the user-perceived latency (UPL) of the documents hosted on its web server. It is obvious that a short user-perceived latency has a positive effect on user satisfaction, and this was also shown by a study conducted by Zona Research Inc. in 1999.
The study showed that if a web site takes more than eight seconds to download, about 30% of visitors are likely to leave the site [35]. User-perceived latency is influenced by many factors, such as the speed of the Internet connection and the bandwidth of the ISP. One way to reduce the UPL is to use the browser cache, in which frequently accessed pages are pre-fetched and stored. Web caching is generally implemented using a proxy server: all requests from users to a web server are intercepted by the proxy server, and if the proxy has a valid copy of the response it returns the result to the user. Otherwise the request is forwarded to the original server, the real server sends the response to the proxy server, and the proxy retains a copy of the response in its cache before sending the result to the user. The main problem with a web cache is that if the cache is not up to date, users are served stale data. Also, if a large number of users access a web server simultaneously, severe caching problems may arise, which can make web pages unavailable. To overcome these issues, the author suggested an approach that combines web prefetching and caching. In this approach, we first identify the objects that need to be pre-fetched in a web cache environment from the information available in the log files; after identifying the objects, we group them into clusters for each client group. When a user requests an object, the user is first assigned to one of the client groups, then the proxy server fetches all the cluster objects of that particular client group, and finally the requested object is delivered to the user.

To achieve a minimal UPL we have to predict a user's preferences based on the pages already visited. The importance of a page is determined by the weight assigned to it by the algorithm; if two or more pages have the same weight, we rely on the page rank. PageRank [22] is a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. A page rank can be assigned to documents of all sizes: if a document has a page rank of 0.5, it means that if a user clicks on a link at random there is a 50% probability of landing on that particular document. Consider a web site consisting of four pages P1, P2, P3 and P4. Initially the page rank of all the pages is the same; since there are four pages, each is assigned a page rank of 0.25. If pages P2, P3 and P4 link only to P1, then each of the three pages contributes its 0.25 page rank to P1, and the page rank of P1 can be calculated as

PR(P1) = PR(P2) + PR(P3) + PR(P4)

If page P2 also has a link to page P3, and P4 has links to all three other pages, then the link-vote value is divided among all the outbound links of a page: P2 contributes 0.125 to P1 and 0.125 to P3, and P4 contributes one third of its page rank to P1. The general formula for the page rank of any page u is

PR(u) = Σ_{v ∈ B_u} PR(v) / L(v)

where B_u is the set of all pages linking to page u and L(v) is the number of outbound links from page v.
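A minimal Python sketch of this calculation applied iteratively to the four-page example above. The uniform initialization and the fixed iteration count are assumptions, P1's own outbound links are added so that its rank is not lost, and the damping factor of the full PageRank algorithm is omitted for simplicity.

    def page_rank(links, iterations=20):
        """Iteratively apply PR(u) = sum over v linking to u of PR(v) / L(v)."""
        pages = list(links)
        pr = {p: 1.0 / len(pages) for p in pages}      # uniform start, 0.25 each here
        for _ in range(iterations):
            nxt = {p: 0.0 for p in pages}
            for v, outs in links.items():
                for u in outs:
                    nxt[u] += pr[v] / len(outs)        # v splits its vote among outlinks
            pr = nxt
        return pr

    # P2 links to P1 and P3; P3 links to P1; P4 links to P1, P2 and P3;
    # P1 linking back to all pages is an illustrative assumption
    links = {"P1": ["P2", "P3", "P4"], "P2": ["P1", "P3"], "P3": ["P1"], "P4": ["P1", "P2", "P3"]}
    print(page_rank(links))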
Web page recommendation is an important part of web personalization, and weighted association rule mining is used to predict the page recommendations. In this method we assign a weight to each web page, and the importance of the page is determined by the weight assigned to it. The weight for each page is based on the frequency with which the page is visited by the user and the time the user spends on it. The frequency weight FW of a page p is calculated as

FW(p) = (number of visits on page p / total number of visits on all pages) × PR(p)

where PR(p) is the page rank of p. The time spent on each page reflects its relative importance to a user, because the user spends more time on the pages he is interested in and quickly traverses the other, unwanted pages. Two factors must be considered when calculating the actual time spent on a page: the size of the web page and the transfer rate. Assuming the transfer rate is constant for a user, the time spent on a page is inversely proportional to the amount of useful information available to the user from that page. The time weight of a page is calculated as

TW(p) = (time spent on page p / size of page p) / max_{q ∈ P} (time spent on page q / size of page q)

Based on these two values, the total page weight is calculated as

W(p) = FW(p) + TW(p)

and a worked sketch of this combined weight appears at the end of this passage. According to the page rank algorithm, a link to an important page appears as an outbound link on many pages.

Web prefetching reduces latency: in prefetching, the network's idle time is utilized to fetch the anticipated web pages. In [7], Chen et al. showed that pre-fetching enhanced the cache hit ratio to 30-75%, and access latency can be reduced by 60% when caching and pre-fetching techniques are combined [8]. Pre-fetching takes place only if the network bandwidth usage is below a predetermined threshold, and only web pages that are not available in the cache are pre-fetched. Pre-fetching increases network traffic, but at the same time it helps reduce latency. Several approaches are available for web pre-fetching, such as the Top-10 approach and the Domain-top approach. In the Top-10 approach, web servers periodically update the web proxies with information about the most popular documents, and the proxies then forward this information to their clients [9]. In the Domain-top approach, the web proxies first search for the popular domains and then look for the important documents in each domain; a suggestion list is prepared for the user from the proxy server's knowledge of the popularity of domains and documents, and this list is used for the user's future requests. In the dynamic web pre-fetching technique, a user preference list, containing the web sites available for immediate access, is maintained for each user and stored in the database of the proxy server. Dynamic web pre-fetching uses intelligent agents to monitor the network traffic: whenever traffic is low the system increases pre-fetching, and in heavy traffic it reduces pre-fetching, thereby utilizing the idle time of the network and keeping the traffic constant. The number of web links to be pre-fetched depends on the bandwidth usage and the weights of the web pages; while assigning weights, preference is given to links that are accessed frequently and recently. Using this technique, the cache hit ratio has been shown to increase by 40-75% and latency to be reduced by 20-63%.
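Returning to the page weights defined above, a small Python sketch that combines the frequency weight FW(p) and the normalized time weight TW(p) into W(p). The visit counts, dwell times, page sizes and page ranks are made-up illustrative inputs.

    def page_weights(visits, dwell_time, page_size, page_rank):
        """Combine frequency weight and normalized time weight per page."""
        total_visits = sum(visits.values())
        # Time per byte: dwell time normalized by page size
        time_per_byte = {p: dwell_time[p] / page_size[p] for p in visits}
        max_tpb = max(time_per_byte.values())
        weights = {}
        for p in visits:
            fw = (visits[p] / total_visits) * page_rank[p]   # FW(p)
            tw = time_per_byte[p] / max_tpb                  # TW(p)
            weights[p] = fw + tw                             # W(p)
        return weights

    visits = {"P1": 40, "P2": 10}
    dwell_time = {"P1": 120.0, "P2": 15.0}     # seconds
    page_size = {"P1": 2048, "P2": 1024}       # bytes
    page_rank = {"P1": 0.7, "P2": 0.3}
    print(page_weights(visits, dwell_time, page_size, page_rank))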
Log file modeling is an important task in web usage mining: if an accurate model is used for the web log file, the accuracy of the web page prediction scheme also increases. The most commonly used model is the Markov model. In a Markov model, each page represents a state, and a visited pair of pages represents a transition between two states. The accuracy of the traditional first-order Markov model is low because of the lack of in-depth analysis; in contrast, the second-order Markov model is more accurate, but its time complexity is high. In [11] a hybrid approach called the dynamic nested Markov model (DNMM) was proposed, in which the second-order Markov model is nested within the first-order model. In the dynamic Markov model the insertion and removal of nodes is much easier. Each node contains all the information about a web page: the web page name; the inlink list, which holds the names of the previous web pages together with counts representing the number of times the current page was reached from each previous page; and the outlink list, which holds the names of the next web pages and their counts. In this model the number of nodes is always the same as the number of web pages. Since the transition matrix structure of the traditional Markov model is replaced with dynamic linked lists, the time complexity of the proposed model is lower than that of the traditional model. An experiment was conducted with a web site that serves 1200 users and receives a minimum of 10,000 requests per day. The experimental data was split into three sets: DS1 containing 3000 pages, DS2 with 1000 pages and DS3 with 1500 pages. DS1 took 537 ms, DS2 took 62 ms and DS3 took 171 ms, so it is evident that the time taken for DNMM generation is directly proportional to the number of web pages and the size of the log file. A minimal sketch of first-order next-page prediction follows.
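A minimal Python sketch of first-order Markov next-page prediction built from transition counts. The flat adjacency-dictionary representation here is a simplification of the nested linked-list model described in [11].

    from collections import defaultdict

    def build_first_order_model(sessions):
        """Count page-to-page transitions across all user sessions."""
        counts = defaultdict(lambda: defaultdict(int))
        for session in sessions:
            for prev_page, next_page in zip(session, session[1:]):
                counts[prev_page][next_page] += 1
        return counts

    def predict_next(model, current_page):
        """Return the most frequent successor of the current page, if any."""
        successors = model.get(current_page)
        if not successors:
            return None
        return max(successors, key=successors.get)

    sessions = [["/index", "/products", "/cart"],
                ["/index", "/products", "/contact"],
                ["/index", "/about"]]
    model = build_first_order_model(sessions)
    print(predict_next(model, "/index"))    # '/products'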
Latency can also be reduced by client-side pre-fetching. A prefetching model proposed by Jaing [3] is based on the user's search pattern and the access rates of all the links on a web page. Each link has a counter that is incremented whenever the link is clicked by a user, and the access rate is the ratio of a link's counter value to the counter value of the page itself. The pre-fetcher fetches the web pages whose access rate is high. The main advantage of this model is that it can be executed independently on the client's machine; the disadvantage is that it increases the processing overhead on the client's computer. Initially a web page contained one HTML document, possibly including some images, but in recent times several HTML documents are embedded in a single web page, in which case the browser displays the embedded documents along with the requested document. These embedded documents decrease the prediction accuracy of the system. Also, if the user requests a page by typing the URL in the browser's navigation bar, the request is not taken into account by any link analysis method. To overcome these drawbacks, Kim proposed a prefetching algorithm in which the request patterns are represented by means of a link graph. The nodes of the graph represent the unique URLs of the HTML documents, and the edges represent hyperlinks or embedded links, directed from the referring document to the referred document. When a user requests a webpage, the access counter value of the node corresponding to that web page or document is incremented by one; likewise, when a user traverses from one page to another, the access counter value of the corresponding edge is incremented by one. It is assumed that the user is browsing the page displayed in the browser if no other request is made within a minimum interval of time; by this time the prefetching module will have been executed and the prefetched documents stored in the cache. A small sketch of this counter-based scheme appears below.
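A minimal Python sketch of the node and edge access counters and the access-rate test described above. The prefetch threshold value is an illustrative assumption.

    from collections import defaultdict

    class LinkGraph:
        """Access counters on documents (nodes) and traversals (edges)."""
        def __init__(self):
            self.node_count = defaultdict(int)
            self.edge_count = defaultdict(int)

        def record(self, referrer, url):
            """Register a request for url, reached from referrer (None if typed)."""
            self.node_count[url] += 1
            if referrer is not None:
                self.edge_count[(referrer, url)] += 1

        def prefetch_candidates(self, page, threshold=0.5):
            """Links on page whose access rate (edge count / page count) is high."""
            visits = self.node_count[page]
            if visits == 0:
                return []
            return [url for (src, url), c in self.edge_count.items()
                    if src == page and c / visits >= threshold]

    graph = LinkGraph()
    graph.record(None, "/index")
    graph.record("/index", "/products")
    graph.record(None, "/index")
    graph.record("/index", "/products")
    print(graph.prefetch_candidates("/index"))   # ['/products']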
References

Agarwal, R. (2010). An architectural framework for web information retrieval based on user's navigational pattern. 195-200.
Dimopoulos, C., Makris, C., Panagis, Y., Theodoridis, E., & Tsakalidis, A. (2010). A web page usage prediction scheme using sequence indexing and clustering techniques. Data & Knowledge Engineering, 69(4), 371-382. Elsevier B.V. doi:10.1016/j.datak.2009.04.010
Georgakis, A., & Li, H. (2006). User behavior modeling and content based speculative web page prefetching. Data & Knowledge Engineering, 59(3), 770-788. doi:10.1016/j.datak.2005.11.005
Jalali, M., Mustapha, N., Mamat, A., & Sulaiman, N. B. (2008). A new classification model for online predicting users' future movements. 0-6.
Kim, Y., & Kim, J. (2003). Web prefetching using display-based prediction. 0-3.
Liu, H., & Keselj, V. (2007). Combined mining of web server logs and web contents for classifying user navigation patterns and predicting users' future requests. Data & Knowledge Engineering, 61(2), 304-330. doi:10.1016/j.datak.2006.06.001
Nair, A. S. (2007). Dynamic web pre-fetching technique for latency reduction. 202-206. doi:10.1109/ICCIMA.2007.303
Nigam, B., & Jain, S. (2010). Generating a new model for predicting the next accessed web page in web usage mining. 2010 3rd International Conference on Emerging Trends in Engineering and Technology, 485-490. IEEE. doi:10.1109/ICETET.2010.56
Pallis, G., Vakali, A., & Pokorny, J. (2008). A clustering-based prefetching scheme on a web cache environment. Computers & Electrical Engineering, 34(4), 309-323. doi:10.1016/j.compeleceng.2007.04.002
Park, S., Suresh, N., & Jeong, B. (2008). Sequence-based clustering for Web usage mining: A new experimental framework and ANN-enhanced K-means algorithm. Data & Knowledge Engineering, 65(3), 512-543. doi:10.1016/j.datak.2008.01.002
Chakrabarti, S., van der Berg, M., & Dom, B. (1999). Focused crawling: A new approach to topic-specific web resource discovery. In Proceedings of the 8th Int. World Wide Web Conf. (WWW8).
Gulli, A., & Signorini, A. (2005). The indexable web is more than 11.5 billion pages. In Special Interest Tracks and Posters of the 14th International Conference on World Wide Web, Chiba, Japan.
Banerjee, A., & Ghosh, J. (2001). Clickstream clustering using weighted longest common subsequences. In Proc. of the Web Mining Workshop at the 1st SIAM Conference on Data Mining.
Cadez, I., Heckerman, D., Meek, C., Smyth, P., & White, S. (2000). Visualization of navigation patterns on a Web site using model-based clustering. In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-2001), Boston, MA, pp. 280-284.
Fu, Y., Sandhu, K., & Shih, M.-Y. (2000). A generalization-based approach to clustering of Web usage sessions. In B. Masand & M. Spiliopoulou (Eds.), Web Usage Analysis and User Profiling: International WEBKDD'99 Workshop, San Diego, CA, Lecture Notes in Computer Science, vol. 1836, Springer, Berlin/Heidelberg, pp. 21-38.

Saturday, January 18, 2020

Childhood vs Adult Learning

Childhood Versus Adulthood Learning

Tricia Barnes

COM/156

January 19, 2012

John Likides

There is no question about it: children and adults learn in different ways. The argument can be made about which way is better, and there are numerous schools of thought on the theories for each, but the bottom line is that there is a clear variation between how a child learns and how an adult learns. Learning is vastly important at both the childhood and adulthood levels. As a child, one must learn with more of a basic, survival mindset in order to overcome the challenges present in the first few years; as an adult, the skills and cognitive abilities discovered in childhood must be expanded and improved in order to meet the tasks appropriate to each growing age level. As a child, becoming familiar with different facts and ideas sets the groundwork for the knowledge we hope to achieve as adults. By establishing a good foundation, the process of learning as an adult can be adapted and improved upon to meet progressing needs. The four main childhood learning theories are Maturationism, Environmentalism, Constructivism, and stage-based teaching. The four main adulthood learning theories are life experiences, Speck's theory, the Andragogy theory, and Jarvis's learning process. Each of these theories attempts to exemplify the processes and skill sets it deems important to the learning process.

One of the pertinent childhood learning theories, Maturationism, deals with the idea that the process by which we learn for the first couple of years is based on markers in our DNA (Hunt, 1969). Most people in this school of thought believe that education and environmental factors merely play a supportive role in child development, while certain instincts embedded in our genes govern the age at which we learn things like how to talk or walk. These factors can be manipulated and intensified by outside influences, but the main governing force behind early childhood development is a Darwin-like evolutionary instinct. Many advocates of Maturationism believe that holding a child back, or starting a child a year late for school, may be more beneficial in the long run, because the child is not yet at the proper developmental maturity to handle that level of information; this exemplifies the idea that a mind can only handle the information it is developed to receive (DeCos, 1997).

Environmentalism is another theory at the forefront of child development. Environmentalism is in fact the contrast to Maturationism in that it supports the idea that a child's development and learning are shaped by their environment and outside factors. The environmentalist theory enforces the idea of recitation and repetition; according to this theory, this is how children learn. By incorporating outside experiences and storing them, children are able to build upon those ideas and improve upon them to learn (Skinner, 1938).
These factors are deemed essential, and a child deprived of them will not be as well educated, or as able to cope with higher learning, as a child who was introduced to them. Some argue that this is why children who come from enriched lifestyles are more likely to succeed in school than those who were less well prepared in infancy and at young ages. Another key theory is Constructivism. This theory holds that children are active learners in their education, and that a child's development is based on their motivation and ability to seek out information (Atherton, 2010). In practice, this theory implements an active learning setting, allowing students to become involved in the learning and introducing toys such as puzzles or blocks that stimulate active interaction, thereby allowing the child to take a more participatory attitude toward their learning. Should a child encounter problems in their learning, this theory supports channeling the process into one-on-one, more individual learning sessions in order to improve on those weaknesses. One big supporter of this theory was Jean Piaget, a very well noted child psychologist. Piaget provided countless studies and supported the idea that most of what a child learns at a young age is what they deem pertinent and important to them.

In contrast to the learning theories established for children, there are equally many devoted to studying the learning process of adults. A major, easily identifiable theory is that of life experiences. Children display this theory to a degree; however, the lasting effects tend to be greater in adults. On an evolutionary basis, children use life experiences to know that falling down hurts, or to stay away from a dish once they realize it is hot. These processes operate more on a cognitive level and do not pay particular attention to an overall learning process. When you are a child and someone takes your toy or pushes you down, you don't tend to be as upset or concerned, and it's usually something that can be easily forgotten. As adults, the lessons learned from life experiences tend to be much more significant, and therefore there is more emphasis on the learning applications of such methods (Lieb, 1991). For example, for most people it takes only getting robbed once to start locking up their belongings. In that sense, adults are able to draw not only from their own life experiences but also from those of society as a whole. For instance, there are many people who have never had a car accident but, laws aside, would still choose to wear a seat belt, simply because other people's experiences have proven it useful for saving lives and preventing injury.

In 1996, educational specialist Marsha Speck designed what is known as Speck's theory of adult education. This theory is a minor variation of the Constructivism learning theory, more or less with the addition of ego in adult learners. The theory offers that an adult will only pursue learning that is significant to them in one way or another, but that they should rely on peer support and not be fearful of judgment (Speck, 1996). Adult learners must also, in most cases, be shown the effect of their knowledge in an applicable setting. Most children follow ideas and concepts that make them happy; adults often cannot maintain that luxury. Therefore, to gain the necessary knowledge, an adult learner must be shown the impact.
In the military, for example, there are often many bygone traditions and customs that many are unable to identify with until they learn the importance and usefulness of the given information. The Andragogy theory is another theory relevant to, and in practice in, the study of adult learning and professional development. In this theory, the main concern is process, not product. It is stated that adults tend to value the experience and methodology over the actual content they are left with at the end. Under this approach, emphasis is put on real-world learning and role-playing situations (Knowles, 1984). The idea of getting a student out of a classroom and into a situation where they can actually learn as they go along is said to have a better and more powerful impact than taking notes or reading about the process in a book. For instance, students in trade career fields in particular tend to exemplify this philosophy through the hands-on and internship training in their curriculum. Vocational-technical schools demonstrate how, even at a learning level, students are able to grasp enough of a trade to iron out their abilities through hands-on application. Another good illustration of this theory is the military, where the majority of the training a given individual receives comes not from book-based learning but from real-world, on-the-job training. In this sense, the student gets immediate gratification and can see the importance of the concepts learned right away.

Both childhood and adulthood learning theories are important in every respect, and the subject being taught should govern which theory is applied. Many adults approach learning a second language with a mindset very difficult to break down, and therefore many find it very difficult; children, on the other hand, are able to grasp a second language far more easily. The argument proposed by Maturationists would be that children have a predetermined timeline for how learning occurs, and therefore children searching for a way to communicate their thoughts are able to pick up on more than one language at a time, as their minds are ripe for that form of knowledge (Hunt, 1969). The largest problems for adult learning are ego and close-mindedness: most adults are simply unable to get out of their own way in order to understand new topics. There are also areas where adults are able to learn certain things at a much faster rate than children, and the best representation of this is in the life experiences theory. Children are able to learn simple concepts, but things like guilt, jealousy, and love are not things children are able to grasp. These abstract emotions can't be taught, even at a childhood level; instead they must be learned on an individual level, as the knowledge is not necessarily universal but more individual. Overall, there are a number of different theories and concepts behind each level of an individual's development. By classifying them, it can be noted what works best and what can be altered; in this way, the living organism that is the education system is dynamically and constantly changing. By dissecting how children learn, it is possible to improve how adults pick up on things like learning a foreign language, and how children learn things like team dynamics.
The open-mindedness and new-age looks at education have shown how many different ways there are to teach, no matter what your age or learning style.

References

DeCos, P. L. (1997, December). Readiness for kindergarten: What does it mean? Sacramento, CA: California Research Bureau, California State Library.
Atherton, J. S. (2010). Learning and teaching; Piaget's developmental theory. Retrieved July 29, 2010, from http://www.learningandteaching.info/learning/piaget.htm
Hunt, J. M. (1969). The impact and limitations of the giant of developmental psychology. In D. Elkind & J. Flavell (Eds.), Studies in cognitive development: Essays in honor of Jean Piaget. New York: Oxford University Press.
Knowles, M. (1984). The adult learner: A neglected species (3rd ed.). Houston, TX: Gulf Publishing.
Lieb, S. (1991, Fall). Principles of adult learning. Vision. Retrieved July 28, 2010, from http://www.economist.com/china
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century-Crofts.
Speck, M. (1996, Spring). Best practice in professional development for sustained educational change. ERS Spectrum, 33-41.

Friday, January 10, 2020

Superpower Has No Moral Duty to Intervene in Foreign Tyrannies

Lord Acton said, "Power tends to corrupt, and absolute power corrupts absolutely." And it is no wonder that superpowers corrupt superbly. Honourable panel of the jury, respected teachers and all my dear friends, and of course not to forget my worthy but disillusioned opponents, a very warm good afternoon to one and all. Nowadays the burning topic is whether a superpower may involve itself in foreign affairs. Today I, Master Bright James George, would like to speak against the motion 'Superpower has a moral duty to intervene in foreign tyrannies.'

In this contemporary world, superpowers have a habit which they call 'moral intervention', but which I call 'poking their nose into others' matters'. Ever since the superpowers rose in this world, they have found this intervention advantageous for themselves and harmful for others. I hope my opponents know that every member of the United Nations has to abide by its policies. In December 1965, the UN General Assembly declared a policy stating that no country is to intervene in the affairs of other countries whatsoever. My opponents cannot turn a blind eye to these events: the US-Vietnam war and the Russia-Chechnya conflict. Is this what my opponents call the superpowers' moral duty, violating UN policies? You have got to think twice, my friends.

I fail to understand why my opponents believe in the so-called moral duty of superpower intervention. Do superpowers know another country's background, culture, tradition, etc.? They know only a little, and remember, little knowledge is always dangerous. With a little knowledge, how can they intervene in foreign tyrannies? It doesn't sound good either. When a country allows a superpower to intervene in its tyrannies, it shows the incapability of that country's government. Moreover, all the success of that country becomes the glory of the superpower. What a shame then! A country must stand on its own feet; it should seek help only in a very critical situation.

My opponents should not forget that in this politically cut-throat competitive world, one never intervenes unless and until one gets a profit out of it. Superpowers do the same: they intervene only to amass wealth and especially to quench their thirst for black water, that is, petroleum. There is no doubt that the motive of the Iraq war was to protect US economic interests, namely American access to Gulf oil. Countries like the U.S. have misused their position as superpower, says the global political scholar Francis Fukuyama, who claims, "Twenty years since the fall of the Berlin Wall and the accepted end of the Cold War, the United States has misused its position as a superpower." When President Bush sent US troops to Somalia in 1992, he cited a humanitarian reason: to feed the starving Somali population. But paradoxically, civil order had broken down, and starvation was used as a weapon against innocent people. The American troops were engaged in a manhunt for the warlord Aideed, which led to a ferocious firefight on October 3, 1993; the search was futile, nothing substantial was accomplished, only misery and sorrow.

Percy Bysshe Shelley, in his Queen Mab, says, "Power, like a desolating pestilence, / Pollutes whate'er it touches; and obedience, / Bane of all genius, virtue, freedom, truth, / Makes slaves of men, and of the human frame / A mechanized automaton." According to me, "Superpower, like a destructing nuisance, pollutes whate'er it intervenes in." For instance, in the former Yugoslavia, the Serbs, Bosnians, etc. fought for nationalism.
But as superpowers like Russia intervened for supposedly altruistic reasons, things became worse; the final result was the disintegration of Yugoslavia. The U.S. and China do not give India permanent membership in the UN Security Council. France accepted, the UK accepted; why not the Americans and the Chinese? Just a mere fear of India becoming a superpower, and of the superpower becoming a soft power. Afraid of India becoming a superpower, the present superpower, the USA, provides arms and ammunition to Pakistan so as to suppress India. However, my obstinate opponents are still under the illusion of superpower intervention. Why do superpowers want to involve themselves in others' matters when they have problems like poverty and economic recession in their own countries? To conclude, I would like to say that superpowers should not mess around with other countries till their own are in order. Finally, to those who believe that 'Superpower has a moral duty to intervene in foreign tyrannies', I say they live in a fool's paradise. Thank you.

Thursday, January 2, 2020

Analysis On Risk Management By Contractors Who Work On...

CHAPTER 3.0: RESEARCH METHODOLOGY

3.1 Brief Case Background

The research takes a case study approach. The case study analysis dwelt on risk management by contractors who work on energy and utility construction projects, including strategies and supporting structures for managing risks, complete with an analysis of how these strategies and structures are implemented and supported by the contractors' resource base. The researcher specifically chose utility contractors for this study because the energy and utilities sector plays an indispensable role in the global economy; in the UK, the industry employs around 2% of the UK workforce (AGCAS, 2012). Moreover, the UK government identified utilities companies as companies heavily involved in risky incidents affecting their sector, thus playing a crucial role in the preparation and planning for emergency responsibilities (UK Government, 2013). According to Yip (2003), construction is risky as it almost certainly involves loss of time and money. Above all, any denial of service during an outage has an impact on communities (Lindman, 2008), and utility services are no exception. Against this background, it could be argued that contractors working on construction projects undoubtedly play a significant role in managing risks in order to stay in business.

3.2 Validity of Case Study as a Method of Research

The case study of utility contractors was undertaken. Yin (2003) recommends the use of case study when the focus...