Research approach

A deductive approach is used for this thesis. It is built around the literature review, which presents support and evidence for what can be expected in the test results and statistics. There are various ways to conduct an event study, but the overall approaches are similar. The structure of the event study used in this thesis is built around the suggestions of MacKinlay (1997). Based on Körner & Wahlgren (2006), I test my hypotheses at a 95% confidence level, since this is a standard interval in statistical studies.
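As a rough illustration of this 95% confidence level test, the following minimal Python sketch applies a two-sided one-sample t-test to a hypothetical set of abnormal returns; the numbers are invented for illustration only.

    import numpy as np
    from scipy import stats

    # hypothetical daily abnormal returns for a sample of events
    abnormal_returns = np.array([0.012, -0.004, 0.021, 0.003, -0.007])

    # H0: the mean abnormal return is zero; two-sided test
    t_stat, p_value = stats.ttest_1samp(abnormal_returns, popmean=0.0)
    reject_h0 = p_value < 0.05  # rejection at the 95% confidence level
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}, reject H0: {reject_h0}")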

Event study

The event study methodology is the standard approach for measuring the impact of an event, such as a merger or acquisition, on a firm. It aims to isolate a specific firm event from industry-wide events in order to quantify and understand how that event affects the firm (Benninga, 2008). In this thesis, the daily stock returns surrounding the event are examined and compared to the expected return. Any difference between the observed return and the expected return is known as the abnormal return. The statistical assumption made is that financial markets, including investors, are efficient at any point in time, so the event is expected to be reflected immediately in the stock price (MacKinlay, 1997). In this event study, the financial market is assumed to exhibit a semi-strong form of efficiency, so that any changes in the stock price surrounding the announcement can be detected. This is in line with evidence found by Fama (1970), and it is therefore safe to assume that this assumption is valid.
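To make the abnormal-return definition concrete, the sketch below estimates expected returns with the market model, the benchmark MacKinlay (1997) discusses; the function name and input arrays are hypothetical, and the choice of estimation model is an assumption rather than a detail stated here.

    import numpy as np

    def market_model_abnormal_returns(stock_ret, market_ret, est_stock, est_market):
        # Fit R_i = alpha + beta * R_m by OLS over the estimation window
        beta, alpha = np.polyfit(est_market, est_stock, 1)
        # Expected returns in the event window, given the market's returns
        expected = alpha + beta * np.asarray(market_ret)
        # Abnormal return = actual return - expected return
        return np.asarray(stock_ret) - expected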

In this thesis, the event is defined as the announcement of a merger or acquisition. To capture the effect of the acquisition announcements, an event window of 3 days is defined, in which the abnormal returns are recorded for each day. Increasing the event window can be problematic, as it raises the risk of dilution, since the stock price might capture other company-specific events, as well as the risk of clustering, both of which can affect the measurement of the abnormal returns (Tuch & O'Sullivan, 2007). These problems are alleviated by the use of a short event window while controlling the sample events for clustering throughout the period.

[Figure 1]

The event window starts one day before the announcement day, also known as the event day (t0), and ends at (t+1). The third day, (t+1), is added to capture the market's reaction on the day following the announcement, while the day before, (t-1), is included to capture any insider information that might leak into the market prematurely (Benninga, 2008).
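As a sketch of how the three-day window could be recorded, the hypothetical function below sums the daily abnormal returns from (t-1) to (t+1) into a cumulative abnormal return; the abnormal-return series is assumed to be indexed by trading day with a unique entry for the event day.

    import pandas as pd

    def event_window_car(ar: pd.Series, t0) -> float:
        # position of the announcement (event) day in the trading calendar
        i = ar.index.get_loc(t0)
        # the slice covers t-1, t0 and t+1; summing gives the cumulative abnormal return
        return float(ar.iloc[i - 1 : i + 2].sum())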

Data

The data are obtained from secondary sources. A large amount of data was collected from several databases to be used for the event study and the regression. Renowned sources were used to mitigate the risk of incorrect data and measurement error. The database holds over 1.5 million deals, which gives a good chance of finding enough deals to provide statistical power for the analysis. Nonetheless, the database proved to have some constraints with regard to what is needed for this thesis: a substantial share of the collected deals had no Datastream code, and some lacked historical prices for the required period.

Thomson One is used as the source for collecting cross-sectional M&A data, as well as the characteristics of each deal, for listed firms in the United Kingdom.

The delimitations and the loss of data are described in figure __.

Thomson Reuters Datastream, embedded in Microsoft Excel, is used to gather the time series data, consisting of daily share prices and the market cap for each cross-sectional deal, as well as market indices. Following MacKinlay (1997), I use daily observations, as this increases the power of the statistical test.
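As a rough illustration of this step, a daily price series exported from Datastream could be turned into daily returns as below; the file name and sheet layout are hypothetical.

    import pandas as pd

    # hypothetical export with dates in the first column and one price column per firm
    prices = pd.read_excel("datastream_export.xlsx", index_col=0, parse_dates=True)
    # simple daily returns: R_t = P_t / P_(t-1) - 1
    returns = prices.pct_change().dropna()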

Data sample table

The sampling process consists of two steps: delimitations and data quality. The selection of data is made in Thomson One through a set of criteria. The sample selection is based on theory and on practical reasons. The total number of M&A announcements reported on Thomson One is ____. Adding the time constraint (year) reduces the sample size to ____ observations. In order to measure the impact on a firm's value, only publicly listed acquirers are included, which reduces the sample to ____. Acquisition announcements of both listed (public) and non-listed (private) targets are included, which further limits the observations to 4,114.

The deal must be for 100% of the target's assets or shares, which implies that the acquiring company's ownership must go from a non-controlling position (<50.01%) to a controlling position; furthermore, the deal size must be more than 1m. According to Jarrell & Poulsen (1989), the size of the deal matters, since small acquisitions tend to have less impact on the acquiring firm, which is why I added a constraint on the deal size. The final sample includes ___ observations. Deals from industries such as banks, financial investments, credit institutions, and ___ were excluded, since those firms are heavily regulated, have different accounting principles, and belong to a different sector of the economy.
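A minimal sketch of how these delimitation criteria might be applied to a Thomson One export, assuming hypothetical column names:

    import pandas as pd

    deals = pd.read_csv("thomson_one_deals.csv")  # hypothetical export
    sample = deals[
        (deals["pct_acquired"] == 100)            # 100% of the target's shares/assets
        & (deals["pct_owned_before"] < 50.01)     # non-controlling -> controlling position
        & (deals["deal_value_m"] > 1)             # deal size above 1m
        & (deals["acquirer_listed"])              # publicly listed acquirers only
        & ~deals["acquirer_industry"].isin(
            ["Banks", "Financial investments", "Credit institutions"]
        )                                         # heavily regulated industries excluded
    ]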

The total sample gathered from Thomson One is made up of ____ announcements, where ____ are for listed targets and 356 are for unlisted targets.

The last stage of the sample selection is determined by the availability and quality of the data.

The first step is that only announced deals for which all data are available in Datastream are included. Many companies were listed within a year of the announcement, which limits the estimation of normal returns, as explained further in the next section. As a result, these observations are excluded from the sample. Additionally, when controlling for clustering, two observations were removed because the same company announced two acquisitions on the same day.
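The clustering control can be illustrated with a minimal sketch that drops every pair of same-day announcements by the same acquirer; the frame and column names are hypothetical.

    import pandas as pd

    # hypothetical sample with one same-day pair announced by firm A
    sample = pd.DataFrame({
        "acquirer": ["A", "A", "B"],
        "announcement_date": ["2006-03-01", "2006-03-01", "2006-03-02"],
    })
    clustered = sample.duplicated(subset=["acquirer", "announcement_date"], keep=False)
    sample = sample[~clustered]  # both observations of the clustered pair are removed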

1) Some of the acquisition announcements fall on a weekend, which means that there is no observable value, as the stocks are not traded. The announcement date is therefore moved to the first subsequent trading day (see the sketch after this list).

2) Datastream cannot find some deals, or finds only part of the deal data; such deals are not useful and are therefore removed from the sample. Datastream also struggles to find some market indices. A possible solution would be to compare the stocks to an overall European index, but this would not be consistent with the rest of the methodology, and these observations are therefore also removed.
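A minimal sketch of the weekend adjustment from item 1), using a pandas business-day offset; this ignores exchange holidays, which would require a proper trading calendar.

    import pandas as pd
    from pandas.tseries.offsets import BDay

    def next_trading_day(date: pd.Timestamp) -> pd.Timestamp:
        # weekdays are kept; Saturday and Sunday roll forward to Monday
        return date if date.weekday() < 5 else date + BDay(1)

    print(next_trading_day(pd.Timestamp("2006-07-01")))  # Saturday -> 2006-07-03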

...
