Stock market trading nowadays has turned into a test of technological prowess among the players involved. Normally, prices rise or fall depending on whether demand has gone up or down; this is the genuine rule of demand and supply, as expected anywhere within a market system. However, people seem to embrace computing power over the natural forces that determine how well or poorly a market system should behave. In light of the flash crash that the US market experienced in 2010, there is a need for a mechanism of checks and balances to observe market trends, possible market slumps, the authenticity and efficiency of trading algorithms, and any risky behavior that might bring about such tragic events as those experienced in 2010.
Stiglitz's arguments focus on the adopted means of conducting market business around computers. In his view, there is a growing tendency to let machines do the tedious work in place of manual input by individuals. This, he implies, eases work but also encourages laziness and increases the risk of markets plunging into serious setbacks in the form of flash crashes. Generally, Stiglitz holds firmly to the idea that overreliance on computing capabilities leaves everything at the mercy of the algorithms in use; any malfunction of the heavily depended-upon electronic systems, or any malicious action by those running them, may prove detrimental to all market operations and leave a nation's economy in an unstable condition. Flash trading not only causes high volatility in market operations but also renders markets less informative and encourages passive participation on the part of different players.
In view of modern trends in market trading patterns and behaviors, it is evident that the use of technology in stock markets takes little account of price-discovery concepts. People trade by watching computers monitor the status of stocks and securities; the machines even make decisions for investors and other traders about the best option to pick: either to trade now or to wait for the market to become more favorable (Mackenzie, 2008). In due course, little information is stored for future reference regarding what investors ought, or ought not, to do.
At first, high-frequency trading seemed to be the cause of the 6 May 2010 market crash. The Dow Jones Industrial Average suffered its largest intraday point loss, though not its largest percentage loss, in history, only to recover much of that drop within a few minutes. Moments after the crash, several parties, including groups and individuals, came out strongly to defend high-frequency trading from blame, saying it was not the cause of the tragic occurrence in the US market. They argued that high-frequency trading could in fact have worked against the crash, helping the market recover quickly afterwards. Big names among active players in the US stock market conducted their own investigations into the incident. They reached conclusions exonerating high-frequency trading from any unpleasant influence on the events in the market, and concluded that high-speed trading could actually have aided the market's recovery and stabilization.
In response to what had happened, the Securities and Exchange Commission set out to discover what had actually brought about the market plunge, in an effort to find a solution to the financial damage caused. The Commodity Futures Trading Commission did the same. It took the two commissions over five months after the flash crash to produce a joint report. Its findings indicated that some actions of high-frequency trading firms had contributed significantly to the high volatility that led to the tragic market drop.
In the recent past, high-frequency or high-speed trading, commonly referred to as HFT, was the largest new concept to emerge in Wall Street trading and, in the minds of many players and observers, the most disorderly. On any particular day, this superfast, computer-driven way of trading accounts for about half of all activity conducted on the country's stock markets. Detractors claim that high-frequency trading contributes to shocking flash crashes and computer glitches that seem to stir up the markets with startling frequency.
High-frequency trading first became a considerable part of the Wall Street scene in the 1980s, when it was held responsible for aggravating the market falls of October 1987. Since then, the computing power involved has grown immensely, and the algorithms that steer the trading have become vastly more sophisticated.
For a number of years, high-frequency trading firms operated in the shadows, mostly far from Wall Street, trading stocks at warp speed and earning billions while criticism soared that they were distorting markets and seriously upsetting many ordinary investors. Recent observations and investigations indicate that they have started to step into the light to burnish their reputation with the public, regulators, and other investors. Recent studies of the practice offer some relieving findings, revealing that the dreaded practice is slowly in decline. Figures show that the $1.25 billion in profits from high-frequency trading in 2012 was 35 per cent lower than in 2011, and a whopping 74 per cent lower than in 2009. This could be pleasant tidings for other players who have long feared high-frequency trading activities that seemingly exploit their high-tech potential to reap maximally from the markets.
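The scale of the decline implied by those percentages can be checked with simple arithmetic. The sketch below assumes the 35 and 74 per cent figures are declines relative to the 2011 and 2009 totals, and recovers the implied earlier profits from the quoted 2012 figure:

```python
# Back-of-envelope check of the quoted HFT profit decline.
# Assumption: the 35% and 74% figures are declines from 2011 and 2009 levels.
profit_2012 = 1.25  # billions of dollars, as quoted

# If 2012 is 35% lower than 2011, then 2011 = 2012 / (1 - 0.35).
profit_2011 = profit_2012 / (1 - 0.35)

# If 2012 is 74% lower than 2009, then 2009 = 2012 / (1 - 0.74).
profit_2009 = profit_2012 / (1 - 0.74)

print(round(profit_2011, 2))  # implied 2011 profits, ~1.92 billion
print(round(profit_2009, 2))  # implied 2009 profits, ~4.81 billion
```

On that reading, industry profits would have fallen from roughly $4.8 billion at the 2009 peak to $1.25 billion in 2012.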
In finance there exists what is called the efficient market hypothesis. This hypothesis asserts that a financial market is efficient in terms of information. Consequently, a trader cannot consistently obtain returns exceeding average market returns on a risk-adjusted basis, given the information accessible at the moment of investing.
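The phrase "risk-adjusted basis" can be made concrete with a toy calculation. The sketch below, using invented return figures and a simple Sharpe-style ratio (mean excess return divided by its volatility), computes the quantity the hypothesis says a trader cannot keep consistently above the market's:

```python
import statistics

# Illustrative annual returns (as fractions); all numbers are made up.
trader_returns = [0.08, 0.12, -0.03, 0.10, 0.05]
market_returns = [0.07, 0.10, -0.02, 0.09, 0.06]
risk_free = 0.02  # assumed risk-free rate

def sharpe(returns, rf):
    """Crude risk-adjusted return: mean excess return over its volatility."""
    excess = [r - rf for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Under the efficient market hypothesis, this edge should not be
# consistently positive for any trader.
edge = sharpe(trader_returns, risk_free) - sharpe(market_returns, risk_free)
print(round(edge, 3))
```

In this made-up example the trader's raw returns beat the market's in most years, yet the risk-adjusted edge comes out negative, which is exactly the distinction the hypothesis turns on.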
Critics blame the rational-markets view for most of the market crashes that have occurred in the recent past, especially since 2000. On the contrary, those who advocate the efficient market hypothesis have responded by affirming that market efficiency is only a simplification of market operations. They indicate that market efficiency does not necessarily mean there is certainty about what will occur in the markets in the future, and they believe that, for practical investment purposes, markets are efficient.
Plunges in overall trading volume have made it quite tricky for traders who hastily trade in the shares offered by slower investors to reap maximum profits. Furthermore, traditional investors such as mutual funds have taken up the high-frequency industry's automated techniques, while the technological cost of trimming further milliseconds off trade durations has become a serious drain on many corporations. Technology saves time for most traders. To the general market, however, it carries serious implications that may bring vastly damaging outcomes, destroying investment efforts, market implementation strategies, and the financial bases of companies, individual investors, regulatory bodies, and the national economy at large.
For a market to plunge, there are a number of possible causes. Some of them are large directional bets and technical hitches; others are unprecedented alterations in market structure and the impact of high-frequency trading activities (Bailey, 2005). Collectively, these were seen to have been among the factors that initiated the series of actions leading to the 6 May 2010 market crash.
In efforts to reap hugely from markets, high-frequency trading firms may choose to exacerbate price declines, as was found to have happened in the 2010 flash crash. When high-speed trading parties aggressively unwind their positions and withdraw from markets during such times of uncertainty, they expose markets and other, slower traders to substantial financial risk should the market plunge. Besides this, sharp changes in the structures anchoring markets are likely to cause problems. In addition, excessive reliance on computing power subjects markets to technical glitches that might be injurious to financial transactions in stock markets; a bug in a trading algorithm, for instance, may have disastrous outcomes. Similarly, traders engaging in large directional bets risk unfavorable results that may come in the form of market plunges (Bailey, 2005).
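Why the withdrawal of fast traders deepens a decline can be illustrated with a toy order-book model. In the sketch below (all prices and quantities invented), the same sell order that barely moves a deep book crashes through a thin one once most resting bids have been pulled:

```python
def execute_sell(order_book, quantity):
    """Walk a sell order down a list of (price, size) bids, best price first.
    Returns the lowest price the order reaches."""
    last_price = order_book[0][0]
    remaining = quantity
    for price, size in order_book:
        if remaining <= 0:
            break
        remaining -= min(size, remaining)
        last_price = price
    return last_price

# A deep book: plenty of resting bids near the top price.
deep_book = [(100.0, 50), (99.9, 50), (99.8, 50), (99.7, 50)]
# The same book after fast traders withdraw most of their bids.
thin_book = [(100.0, 10), (99.9, 10), (99.8, 10), (99.7, 10), (95.0, 1000)]

print(execute_sell(deep_book, 100))  # fills near the top of the book
print(execute_sell(thin_book, 100))  # the same order drives the price to 95.0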
In spite of all the factors above that might contribute to plummeting markets, there are significant preventive and recovery measures that may be instituted to save markets in case the worst happens. These may take the form of legislation, whereby regulatory bodies ensure that the rules governing market operations are adhered to and that discipline among market players is maintained at the highest level at all times, by everyone. Traders must also observe high levels of integrity and maintain unquestionable discipline when carrying out their business.
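One concrete preventive mechanism regulators adopted after the 2010 crash is the circuit breaker, which pauses trading when a price moves too far too fast. A minimal sketch of the idea follows; the 10 per cent threshold and the observation window are chosen for illustration, not taken from the actual SEC rules:

```python
def should_halt(prices, window=5, max_drop=0.10):
    """Return True if the price fell more than max_drop (e.g. 10%)
    from its peak within the last `window` observations."""
    recent = prices[-window:]
    peak = max(recent)
    return (peak - recent[-1]) / peak > max_drop

# A steady tape does not trip the breaker...
print(should_halt([100, 101, 100, 99, 100]))   # False
# ...but a sudden plunge does.
print(should_halt([100, 101, 100, 85, 84]))    # True
```

A pause of even a few minutes gives human participants time to assess whether a move reflects genuine news or a runaway feedback loop among algorithms.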
Big players in financial markets, such as CME Group, need to respect other investors and small traders in the market. Mischief on the part of significant market participants might cause tremendous repercussions that would most likely hurt their weaker counterparts.
Regulatory bodies, under the watchful eye of the Securities and Exchange Commission, should use their authority and legal powers to ensure that the different competing forces in the stock markets engage in clean competition. High-frequency trading companies, in particular, should be monitored to ensure they do not carry out any activities that would lead to terrible occurrences like the sequence of events behind the 2010 crisis, or the bad experiences of the Wall Street crisis.
The Securities and Exchange Commission has the mandate to ensure total sanity and order in the markets. Under US legislation, the Commission has the authority to monitor and regulate market operations. It implements policies to this end on its own as well as in conjunction with other regulatory bodies such as the Commodity Futures Trading Commission and the NYSE. Collaborative efforts with such established bodies may help realize efficient monitoring of the behavior of all players in the markets.
In the financial markets, procedures must be followed strictly and must allow each interested party to know exactly what is happening at all times. This would help traders and investors assist regulatory bodies in monitoring and reporting any kind of unacceptable practice by firms or individuals. The technological techniques adopted, especially by high-frequency trading firms, must be assessed and authenticated beforehand so that incidents of suspicious technical glitches can be prevented.
In conclusion, stock markets can be most effective and stable if the advantages offered by the efficient market hypothesis and the rational-markets proposition gain considerable acceptance among all players. Adherence to regulatory guidelines is another key pillar in ensuring fair play and a just competitive ground for all traders and investors. Essentially, some parties may see loopholes in the system and choose to exploit them for their own benefit. To prevent this, such loopholes must be identified and sealed, because their exploitation by a few may jeopardize the hard work of many other players and the entire market. Thus, regulation of financial markets remains the main anchor of a prosperous market. Stiglitz's concerns are genuine and reflect what actually occurs in the financial arena. Markets need to be sufficiently informative, and overreliance on technology should be checked, allowing human involvement in some instances so that unscrupulous individuals cannot manipulate algorithms for their own gain at the expense of many others.