Category Archives: Foreign Exchange Releases

Foreign Exchange News Releases

High-frequency trading: Better for the Investor?

• Over the past years, high-frequency trading has progressively gained a foothold in financial markets, enabled and driven by an interplay of legislative measures, increased competition between execution venues and significant advances in information technology.

• The terms “algorithmic trading” and “high-frequency trading” are frequently mixed up in the public debate. In contrast to traditional trading strategies, high-frequency traders do not aim to establish and hold long-term positions. Rather, they enter into short-term positions and end the trading day “flat”, i.e. without carrying over significant positions to the next business day. Algorithmic trading strategies, on the other hand, typically aim at reducing the adverse market impact of large-sized, institutional orders.

• Strategies employed by high-frequency traders are manifold: They may be differentiated into statistical arbitrage, liquidity detection and liquidity providing strategies (market-making).

• Extraordinarily high-speed and sophisticated quantitative and algorithmic computer programs for generating, routing, and executing orders are of paramount importance for the financial success of high-frequency traders.

• Existing evidence related to the impact of high-frequency trading on certain market quality and efficiency indicators is, as of now, inconclusive: while high-frequency traders provide liquidity to the market and contribute to the price formation process, some market participants feel they are at a disadvantage because they cannot keep up with the necessary investments in trading technology.

• In light of the growing importance of high-frequency trading and its allegedly harmful effects in the event of adverse market conditions, regulators are currently putting strong emphasis on subjecting high-frequency trading to prudential and organisational requirements and to supervision by a competent authority.

High-frequency trading has been a focus of considerable public and regulatory attention since May 6, 2010, when financial markets were given a drastic wake-up call by what later became known as the Dow Jones “flash crash”. Although a subsequent investigation by the SEC cleared high-frequency traders of directly having caused the flash crash, what could be observed that day were the effects of the evolution of the financial markets and the interplay of regulation, competition and technology.

This research briefing aims to shed light on current developments in the financial markets, the repercussions of the flash crash and the conclusions drawn by regulators. We will look at the pros and cons of high-frequency trading from an economic perspective and at the future prospects of this business model.

The big picture

Both in the US and in Europe, comprehensive pieces of legislation were passed in the years preceding the crisis (re-)regulating the securities markets. In Europe, the Markets in Financial Instruments Directive (MiFID) is the cornerstone of securities markets regulation. Applicable since November 2007, MiFID fosters greater competition in the provision of services to investors and between trading venues in order to contribute to deeper, more integrated and liquid financial markets. MiFID’s counterpart in the US is the Regulation National Market System (RegNMS) of 2005, designed to strengthen the regulatory structure of US equity markets. RegNMS fosters both competition among individual markets and competition among individual orders in order to promote efficient and fair price formation across securities markets.

The events on May 6, 2010:

The so-called “Flash Crash” was a brief period of extreme market volatility on May 6, 2010. That day, the Dow Jones Industrial Average (DJIA) fell by 998.5 points within minutes, which at the time marked the biggest intraday point decline in the history of the index.

At the same time, substantial developments in information technology (IT) have spurred an electronic revolution, enabling market participants’ remote access to multiple execution venues without the need for physical presence, which ultimately led to an “arms race” for the most effective deployment of IT. IT has also been used to increasingly automate the order-execution process.

But why has this fostered the emergence of high-frequency trading? Breaking the monopoly of exchanges by ending the so-called “concentration rule” (in Europe) or by providing intermarket price priority for quotations (in the US) introduced more competition. Exchanges reacted to the increased competitive pressure by modifying their fee schedules, which not only meant a de facto reduction of fees for investors but also facilitated entirely new tariff structures. Pricing mechanisms such as maker-taker pricing contributed to a notable decrease in frictional costs for small trades. Over time, the employment of simple algorithms for straightforward order-execution tasks became standard procedure, which is evidenced by the drastic decrease in average trade sizes on major stock exchanges in recent years: on the NYSE, for instance, the average trade size is now only 200 shares, down from 1,600 shares fifteen years ago; the average value of an order has fallen to USD 6,400 from USD 19,400 five years ago. In addition, the new tariff structures increasingly spurred the development of more sophisticated algorithms to handle more complex order-management tasks.

Hierarchy of terms

Beginning in the late 1990s, the electronification of execution venues enabled market participants (banks, brokers and their institutional and retail clients) to remotely access electronic order books. Electronic trading refers to the ability to transmit orders electronically as opposed to via telephone, mail, or in person. Since most orders in today’s financial markets are transmitted via computer networks, the term is rapidly becoming redundant.

Algorithmic trading (AT) is more complex than electronic trading; it is an umbrella term which does not necessarily imply the aspect of speed typically connoted with HFT. Algorithms were originally developed for use by the buy-side to manage orders and to reduce market impact by optimising trade execution once the buy-and-sell decisions had been made elsewhere. Hence, AT may be defined as electronic trading whose parameters are determined by strict adherence to a predetermined set of rules aimed at delivering specific execution outcomes. Algorithms typically determine the timing, price, quantity, and routing of orders, dynamically monitoring market conditions across different securities and trading venues, reducing market impact by optimally and sometimes randomly breaking large orders into smaller ones, and closely tracking benchmarks over the execution interval (Hendershott et al., 2010).
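To make the idea of a rule-driven execution algorithm concrete, the minimal sketch below slices a hypothetical parent order into roughly equal child orders with a little randomisation, the basic logic behind a time-weighted (TWAP-style) schedule. The quantities, slice count and jitter are illustrative assumptions, not a description of any particular firm's algorithm.

```python
import random

def twap_schedule(parent_qty, n_slices, jitter=0.2):
    """Split a parent order into n_slices child orders of roughly equal size.

    A small random jitter makes the schedule less predictable to other
    participants; the last slice absorbs any rounding remainder.
    """
    base = parent_qty / n_slices
    slices, remaining = [], parent_qty
    for i in range(n_slices - 1):
        qty = int(base * (1 + random.uniform(-jitter, jitter)))
        # never take more than what leaves at least one share per remaining slice
        qty = max(1, min(qty, remaining - (n_slices - 1 - i)))
        slices.append(qty)
        remaining -= qty
    slices.append(remaining)
    return slices

# Example: work a hypothetical 16,000-share parent order in 20 child orders.
print(twap_schedule(16_000, 20))
```

A production execution algorithm would additionally adapt the schedule to realised volume, spreads and venue conditions, as described above.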

High-frequency trading (HFT) is a subset of algorithmic trading where a large number of orders (which are usually fairly small in size) are sent into the market at high speed, with round-trip execution times measured in microseconds (Brogaard, 2010). Programs running on high-speed computers analyse massive amounts of market data, using sophisticated algorithms to exploit trading opportunities that may open up for milliseconds or seconds. Participants constantly take advantage of very small price imbalances; by doing so at a high rate of recurrence, they are able to generate sizeable profits. Typically, a high-frequency trader would not hold a position open for more than a few seconds. Empirical evidence reveals that the average U.S. stock is held for 22 seconds.

Strategies

Over time, algorithms have continuously evolved: while initial first-generation algorithms – fairly simple in their goals and logic – were pure trade execution algos, second-generation algorithms – strategy implementation algos – have become much more sophisticated and are typically used to produce their own trading signals which are then executed by trade execution algos. Third-generation algorithms include intelligent logic that learns from market activity and adjusts the trading strategy of the order based on what the algorithm perceives is happening in the market.

HFT is not a strategy per se but rather a technologically more advanced method of implementing particular trading strategies. The objective of HFT strategies is to seek to benefit from market liquidity imbalances or other short-term pricing inefficiencies.

Liquidity-providing strategies mimic the traditional role of market makers – but unlike traditional market makers, electronic market makers (liquidity providers) have no formal market making obligation. These strategies involve making a two-sided market with the aim of profiting by earning the bid-ask spread. They have been facilitated by maker-taker pricing models and have evolved into what is known as Passive Rebate Arbitrage. Because much of the liquidity provided by high-frequency traders (HFTs) represents “opportunistic liquidity provision”, entering and exiting large positions can be made very difficult.
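A stripped-down illustration of such a liquidity-providing strategy is sketched below: it quotes symmetrically around a mid-price and books the spread when both sides trade. The spread, size and mid-price are invented numbers; real electronic market makers also manage inventory, adverse selection and queue position.

```python
def two_sided_quote(mid, half_spread=0.01, size=100):
    """Return an illustrative bid/ask pair around the current mid-price."""
    return {"bid": round(mid - half_spread, 4),
            "ask": round(mid + half_spread, 4),
            "size": size}

def spread_capture(quote):
    """Gross profit if the quoted size trades on both the bid and the ask."""
    return round((quote["ask"] - quote["bid"]) * quote["size"], 2)

q = two_sided_quote(mid=50.00)
print(q)                  # {'bid': 49.99, 'ask': 50.01, 'size': 100}
print(spread_capture(q))  # 2.0 -> the 2-cent spread earned on 100 shares
```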

Pursuing statistical arbitrage strategies, traders exploit statistical relationships between the prices of related securities and seek to profit from temporary imbalances in those relationships. Subtypes of arbitrage strategies range from arbitrage between cross-border or domestic marketplaces to arbitrage between the various forms of a tradable index (the future or the basket of underlying stocks) and so-called cross-asset pairs trading, i.e. arbitrage between a derivative and its underlying.
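The core arithmetic of one simple statistical-arbitrage variant, a pairs trade, is sketched below: the z-score of the price spread between two historically related instruments flags when the relationship has temporarily diverged. The threshold, window and price series are arbitrary illustrative choices, not a tested strategy.

```python
import numpy as np

def pairs_signal(prices_a, prices_b, entry_z=2.0):
    """Flag a divergence between two historically related instruments.

    Returns -1 (sell A / buy B), +1 (buy A / sell B) or 0 (no trade), based on
    the z-score of the latest price spread over the sample window.
    """
    spread = np.asarray(prices_a) - np.asarray(prices_b)
    z = (spread[-1] - spread.mean()) / spread.std()
    if z > entry_z:
        return -1   # spread unusually wide: sell A, buy B, bet on convergence
    if z < -entry_z:
        return +1   # spread unusually narrow: buy A, sell B
    return 0

# Toy series with a synthetic divergence in the final observation.
a = np.array([100.0, 100.1, 100.2, 100.0, 100.1, 100.2, 100.1, 100.0, 100.2, 101.4])
b = np.array([ 50.0,  50.1,  50.1,  50.0,  50.0,  50.1,  50.1,  50.0,  50.1,  50.2])
print(pairs_signal(a, b))  # -1 -> spread has widened beyond 2 standard deviations
```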

In terms of liquidity detection, traders try to find out whether large orders are resting in a matching engine by sending out small orders (“pinging”) to probe for them. When a small order is filled quickly, there is likely to be a large order behind it.

Players

High-frequency traders are mainly proprietary traders. This means they utilise their own capital for trading activities and do not usually conduct HFT on an agency basis. The use of extraordinarily high-speed and sophisticated quantitative and algorithmic computer programs for generating, routing, and executing orders is absolutely decisive. It requires speedy market data delivery from trading centre servers to the servers of the HFT firm; speedy decision processing in the HFT firm’s trading engines; speedy access to trading centre servers; and speedy order execution by the trading centres.

Hence, extremely low latency, defined as the time that elapses between the entry of an order and its execution and processing, is vital for HFTs. Market operators offer speed-sensitive market participants the installation of their trading engines directly adjacent to their own infrastructure. This co-location of servers in close physical proximity to the market operator’s systems minimises network latencies between the matching engine of the trading venue and the servers of the market participants.
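As a rough illustration of the orders of magnitude involved, the snippet below times a placeholder round trip with a monotonic clock and reports the average in microseconds. `send_order_stub` is a hypothetical stand-in, not a real exchange API; real latency measurement is done at the network and hardware level.

```python
import time

def send_order_stub():
    """Hypothetical stand-in for the round trip to a matching engine (a no-op here)."""
    pass

def average_round_trip_us(n=10_000):
    """Average round-trip time of send_order_stub over n calls, in microseconds."""
    start = time.perf_counter_ns()
    for _ in range(n):
        send_order_stub()
    return (time.perf_counter_ns() - start) / n / 1_000

print(f"average round trip: {average_round_trip_us():.3f} µs")
```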

Further characteristics of HFT firms include real-time data analysis in order to produce automatic trading decisions and very short time-frames for establishing and liquidating positions, resulting in the submission of numerous orders that are often cancelled shortly after submission (cancellation rates of greater than 80% are not uncommon). Moreover, HFT firms end the trading day “delta-neutral”, i.e. in as close to a flat position as possible without carrying significant, unhedged positions overnight.

Impact analysis

Existing evidence related to the impact of HFT on certain market quality and efficiency indicators is inconclusive. Some studies (e.g. Hendershott and Riordan, 2009; Jovanovic and Menkveld, 2010) suggest that HFT using market making and arbitrage strategies has added liquidity to the market, reduced spreads and helped align prices across markets. While there is no proof of a negative liquidity impact in the academic literature, certain issues still remain:

• HFTs are under no affirmative market making obligation, i.e. they are not obliged to provide liquidity by consistently displaying high-quality, two-sided quotes. This may translate into a lack of available liquidity, in particular during volatile market conditions.

• HFTs contribute little to market depth due to the marginal size of their quotes. This may result in larger orders having to transact with many small orders and may affect overall transaction costs.

• HFT quotes are barely accessible due to the short duration for which the liquidity is available when orders are cancelled within milliseconds.

Another interesting issue is whether HFT contributes to the price formation process on equities markets. In this context, Brogaard (2010) examines a large data set of HFT firms trading on Nasdaq and finds that, firstly, HFTs add substantially to the price formation process as they tend to follow a price reversal strategy (irrespective of whether they are supplying liquidity or demanding it), driven by order imbalances, and so tend to stabilise prices. Secondly, HFTs do not seem to systematically front-run non-HFTs. They provide the best bid and offer quotes for a significant portion of the trading day, but only around a quarter of the book depth (as do non-HFTs) and reduce their supply of liquidity only moderately as volatility increases. Thirdly, HFTs engage in a less diverse variety of strategies than non-HFTs, which may exacerbate market movements if HFTs use similar trading strategies. Fourthly, while in principle high cancellation rates could impact the smoothness of execution in markets where HFTs are present, prevailing narrow spreads seem to suggest that cancelled quotes are quickly replaced by other market participants. Hendershott and Riordan (2009) find that algorithmic traders’ quotes play a larger role in the price formation process than human quotes. Summing up, on the one hand, price discovery benefits from market participants who quickly detect anomalies in market prices and correct them. On the other hand, HFT may distort price formation if it creates an incentive for natural liquidity to shift into dark pools as a way of avoiding transacting with ever-decreasing order sizes.

In terms of market volatility, neither Hendershott and Riordan (2009) nor Brogaard (2010) find any evidence for a detrimental impact of either AT or HFT.

Economic perspective and potential regulatory aspects

In the currently ongoing debate on how to regulate HFT, the question arises whether this practice collides with the economic functions of a financial market. Originally, the electronification of exchange trading led to a democratisation of this business: retail investors benefited from market access as quick as that of professionals and from lower transaction costs. Today, however, special arrangements such as co-location services to reduce latency or the provision of special trade data feeds give preference to the needs of HFTs. Unable to make similar investments in trading technology, other market participants raise concerns that they are at a disadvantage. They also fear that HFTs can execute orders and hit liquidity ahead of them. Moreover, (sub-penny) arbitrage, where HFTs buy and sell stock purely to collect rebates, is often criticised as bringing no value to the (retail/long-term) investor.

These concerns seem to be partially justified: unlike registered market makers, HFTs have neither the obligation nor incentive to continue to provide liquidity to the market in the event of adverse market conditions. This means they are able to withdraw liquidity at any time. Yet, it must also be acknowledged that in normal market circumstances, HFTs do increasingly provide liquidity to the market (“artificial volume creation”) that would otherwise not be available, easing the pressure on supply and demand. In consequence, spreads have been narrowed (and are kept narrow), benefitting both retail and institutional investors.

In view of these developments, the European Commission, in its review of the MiFID framework directive, intends to subject HFT to MiFID requirements and to supervision by a competent authority. The Commission proposes that all persons engaged in HFT above a minimum quantitative threshold be authorised as investment firms and therefore become subject to full regulatory oversight and to a number of organisational prerequisites such as risk management obligations and capital requirements. In addition, the Commission intends to introduce amendments to MiFID related to the provision of liquidity by HFTs: according to these plans, operators of regulated markets would be required to ensure that an HFT firm which executes a significant number of trades in a certain instrument continues to provide liquidity on an ongoing basis, subject to conditions similar to those applicable to market makers. In terms of order persistence and tick sizes, operators of regulated markets may be required to ensure that orders remain in the order book for a minimum period before being cancelled, or alternatively that the ratio of orders to transactions executed by any given participant does not exceed a specified level. Implementing measures could further specify minimum tick sizes that would generally apply to all trading, not just automated trading.

Conclusion and outlook

The events around the May 6 flash crash have shown that equity markets may be vulnerable to strategies facilitated by the latest evolutions in trading technology. Hence, regulators both in the US and in the EU are reacting to this potential threat by subjecting HFTs, to the extent they are not already, to prudential and organisational requirements and to full regulatory oversight by a competent authority. The European proposals are closely related to those in the US but are still, in the context of the MiFID review, in their early stages. Certain suggestions seem reasonable: one of them is the Commission’s suggestion that co-location facilities be offered on a non-discriminatory basis. This is sensible on the grounds of maintaining competitive neutrality, but may be difficult to put into practice, as physical capacity for co-location is naturally limited.

Other proposals appear more problematic: For instance, the suggestion to require HFTs to provide liquidity on an ongoing basis may seem reassuring at first sight; however, it would expose market making firms to price risk in times of crashing markets, which would have an adverse rather than a positive impact on financial stability. Instead of imposing new obligations on market makers, a better way to help prevent market failures would be to implement a market-spanning framework of carefully designed safeguards (so-called volatility interruptions in the EU, circuit breakers in the US) which halt the market during market breakdowns, provide the opportunity for participants to cool down and to then re-open trading at new equilibrium prices. In addition, exchanges and other trading platforms could be required to test high-frequency and algorithmic trading programmes before they are used in the markets.

References

Aldridge, I. (2010). High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems. John Wiley & Sons.

Brogaard, J. (2010). High Frequency Trading and its Impact on Market Quality. Northwestern University Kellogg School of Management Working Paper. September 2010.

Hendershott, T., Jones, C.M., and A. Menkveld (2010). Does Algorithmic Trading Improve Liquidity? Journal of Finance.

Hendershott, T., and R. Riordan (2009). Algorithmic trading and information. NET Institute Working Paper No. 09-08.

Jovanovic, B., and A. Menkveld (2010). Middlemen in limit order markets. Working Paper. New York University. 2010.

Institutional Investors Allocating to CTAs

Agecroft Partners See Institutional Investors Finally Allocating to CTAs

Agecroft Partners is seeing widespread demand from institutional investors for Commodity Trading Advisors (CTAs), a dramatic change from the historical reluctance of many institutional investors and their consultants to allocate to this strategy. Over the past 10 years, the number of pension funds allocating to hedge funds has steadily increased, along with the percentage of the average portfolio allocated to them. While many of the major hedge fund strategies, such as Equity Long/Short, have been widely accepted by institutional investors for many years, CTAs have only recently been accepted as a core hedge fund allocation as pension funds scramble to enhance their returns and reduce the volatility of their portfolios after the market correction of 2008.

CTAs are managers that typically use a quantitative trading program to generate momentum- or trend-based returns by going long or short futures contracts in areas such as commodities, foreign currencies, equity indexes, fixed income and interest rates. Before 2009, very few pension funds allocated to CTAs, despite the fact that the strategy represents approximately 15% of the hedge fund industry and is one of the oldest and most regulated of all hedge fund strategies. In addition, CTAs have been one of the only strategies to generate returns that have been negatively correlated to equity benchmarks. The reason institutional investors avoided CTAs was that they could not understand how the systematic models worked, nor could they evaluate which firm’s model was superior. Most large institutional investors were more comfortable with fundamentally driven investment strategies that sounded similar to their long-only managers’ investment processes.

After the 4th quarter of 2008, pension funds realized that their portfolios were not as diversified as they previously believed. Investment committees were shocked when they received their 2008 year-end performance reports, which showed that correlations of performance between their managers had risen significantly. Many saw their emerging market equity managers down over 50%, their US equity managers down in the mid-40% range, high-yield managers down close to 30%, the DJ-UBS Commodity Index down 35% and the average hedge fund manager down in the high teens. While all this carnage was going on in their portfolios, the Barclays CTA index was up over 14%, which began to get institutional investors’ attention.

Over the next year, many institutional investors along with their consultants began to take a closer look at CTAs and discovered that not only did they do well in 2008, but the average CTA was also up in 2000, 2001 and 2002, which, in addition to 2008, were the last 4 years in which the S&P had posted negative returns. Some CTAs also exhibited a historical dynamic correlation to the equity markets, being positively correlated in up equity markets and negatively correlated in down markets, driven by systematic models designed to make money from both bull and bear trends. This means that correlations alone understate the diversification benefits of these trend-following CTAs. A better statistic to look at is their correlation in down markets minus their correlation in up markets.
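That statistic can be computed directly from monthly return series, as in the hypothetical sketch below: the correlation conditional on negative equity months minus the correlation conditional on positive equity months. The return series are placeholders, not actual index data.

```python
import numpy as np

def conditional_corr_diff(cta_returns, equity_returns):
    """Correlation in down equity markets minus correlation in up equity markets.

    A strongly negative value indicates better diversification than the
    unconditional correlation alone would suggest.
    """
    cta = np.asarray(cta_returns)
    eq = np.asarray(equity_returns)
    down, up = eq < 0, eq >= 0
    corr_down = np.corrcoef(cta[down], eq[down])[0, 1]
    corr_up = np.corrcoef(cta[up], eq[up])[0, 1]
    return corr_down - corr_up

# Hypothetical monthly returns (not actual index data).
equity = np.array([0.03, -0.05, 0.02, -0.08, 0.04, -0.02, 0.05, -0.06])
cta    = np.array([0.01,  0.02, 0.00,  0.04, 0.02,  0.01, 0.03,  0.03])
print(round(conditional_corr_diff(cta, equity), 2))
```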

The 4th quarter of 2008 also helped to accelerate the demand for CTAs, as many institutional investors enhanced their due diligence process after the Madoff fraud was exposed and after the unpleasant experience of many hedge fund managers raising gates on withdrawals or suspending redemptions due to a mismatch between fund liquidity terms and the underlying investments. During this period, CTAs were able to offer full liquidity to their investors and to accurately value their portfolios at all times. These events put a greater focus on liquidity, separate account management, transparency and risk management. CTAs in general trade only in highly liquid, price-transparent futures and currency contracts and typically allow their investors monthly liquidity. Some CTAs have even begun to offer weekly or daily liquidity. While many hedge funds are reluctant to take on separate accounts, CTAs typically welcome separate accounts and offer full transparency of the underlying securities. Many of the leading CTAs are mature and well-developed businesses and so are able to offer an institutional infrastructure with large teams in research, technology, operations and legal/compliance.

These attributes are all extremely compelling to institutional investors, which has led to a large increase in their allocations to CTAs. Some of the notable institutions that have allocated to CTAs include the Texas Teachers Retirement System, the New Jersey Public Employees’ Retirement System and Eastman Kodak Co. Once investment trends begin within the pension fund industry they tend to last for a very long time: the largest funds act as first movers and the rest of the industry slowly replicates the decisions of the industry leaders. This same phenomenon is also taking place within the endowment and foundation marketplace.

The big question left for institutional investors is how to differentiate between the more than a thousand CTAs in the marketplace. Over the past 2 years, a majority of institutional investors have chosen to make their initial CTA allocation with a small number of the largest managers in the industry, which has caused these CTA firms’ assets to swell significantly. The major problem with hiring managers that have experienced such rapid growth is that investors cannot participate in the historical performance generated before they invested in the fund. Although many of these managers state that they continue to trade across a large number of markets, the least liquid but more diversifying markets may represent a smaller and smaller percentage of their portfolios as they grow. Some may even alter their investment process or reduce the level of targeted volatility and returns in order to expand capacity.

Institutional investors should consider using multiple evaluation factors to select the appropriate CTA manager, including assets under management, size of the research team, organizational infrastructure, research process, historical performance and risk controls. Instead of allocating to the largest managers, investors might want to consider investing in managers within the mid-sized asset range of $2 billion to $10 billion. This asset size is big enough to support a substantive research team, but not so large that alpha may be diluted over a growing asset base. One of the keys to running a successful CTA over time is to constantly enhance proprietary models, because if left static, the models’ effectiveness eventually erodes, causing a decline in performance. As a result, it is important to hire a CTA with a well-developed research team. In evaluating the research process, one of the key factors to focus on is how much transparency into the process the firm gives investors. Since CTA models and organizations tend to evolve over time, it is important to put greater emphasis on more recent short- and medium-term performance when examining a firm’s historical track record. Another strategy gaining popularity among institutional investors is hiring a small basket of CTAs: since individual CTAs can be volatile and are hard to differentiate between, many investors prefer to spread their CTA allocation over smaller investments with 2 or 3 managers. The benefit is capturing the uncorrelated returns while reducing the potential drawdowns in performance.

In summary, CTAs have come a long way over the past 2 years in gaining credibility with institutional investors. These investors have been drawn to a historical return stream that is uncorrelated with long-only benchmarks, the institutional structure of the leading firms in the strategy, the high level of liquidity of both the underlying investments and the fund terms, the ease of creating a separate account and, most importantly, strong historical performance. Agecroft Partners believes that we are in the initial stages of seeing significant asset flows from institutional investors into this strategy.

Source: White paper published by http://www.agecroftpartners.com/papers-CTAs.html

Currencies Quants Research

EPQRG currently engages in quantitative research and the development of quant strategies trading currencies in the interbank (“spot”) market and options on futures.

The Currency Quant Trading Programs are 100% systematic and based on quantitative analysis, a statistical concept. The approach is 50% statistical arbitrage and 50% position-size management. No classical technical or fundamental analysis is used, no pattern-recognition techniques and no trading rules based on a trader’s experience (discretionary trading). Unlike most trading systems, which attempt to predict market direction, EPQRG’s trading model reacts to price action when making trading decisions.
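As a purely hypothetical illustration of the position-size-management half of such a program (not EPQRG’s actual model), the sketch below scales notional exposure inversely with recent volatility so that a one-standard-deviation daily move costs roughly a fixed fraction of capital. All parameters are illustrative assumptions.

```python
import numpy as np

def position_size(capital, recent_returns, target_daily_risk=0.002):
    """Volatility-targeted position size.

    Sizes the position so that one standard deviation of the instrument's
    recent daily returns corresponds to roughly target_daily_risk of capital.
    """
    vol = np.std(recent_returns)
    if vol == 0:
        return 0.0
    return capital * target_daily_risk / vol

# Example: USD 1,000,000 of capital and a week of hypothetical EUR/USD daily returns.
returns = np.array([0.004, -0.006, 0.003, -0.002, 0.005])
print(round(position_size(1_000_000, returns)))  # notional to trade, in USD
```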

This particular research is rooted in the notion that guessing market direction is impossible, as proven by many academic studies.

The primary focus of this research is to help overcome an obstacle many quantitative research groups face: having to translate code from the development environment to the trading environment. These groups are often vulnerable to errors in the code translation as well as to delays that can render a winning strategy obsolete before it goes into production.


Through EPQRG, our clients can manage the whole strategy life cycle within one integrated platform and also automate execution across a range of asset classes and markets on multiple continents. For firms already able to develop and trade in the same environment, the platform vastly increases the range of markets on which they can capture opportunities.

High frequency trading system design & management

Trading firms are highly reliant on data mining, computer modeling and software development. Financial analysts perform many tasks similar to those in the software and manufacturing industries. However, the finance industry has not yet fully adopted the high-standard systems engineering frameworks and process management approaches that have been successful in the software and manufacturing industries. Many of the traditional methodologies for product design, quality control, systematic innovation, and continuous improvement found in engineering disciplines can be applied to the finance field.

Market and technological knowledge acquired from engineering disciplines can improve the design and processes management of high frequency trading systems. High frequency trading systems are computation-based. These systems are automatic or semi-automatic software systems that are inherently complex and require a high degree of design precision.

The design of a high frequency trading system links multiple fields, including quantitative finance, system design and software engineering. In the finance industry, where mathematical theories and trading models are relatively well researched, the ability to implement these designs in real trading practices is one of the key elements of an investment firm’s competitiveness. The capability of converting investment ideas into high performance trading systems effectively and efficiently can give an investment firm a huge competitive advantage.

Our development and planning process is composed of high frequency trading system design, system modeling and principles, and process management for system development. Particular emphasis is given to backtesting and optimization, which are considered the most important parts of building a trading system. This research builds systems engineering models that guide the development process. We also use experimental trading systems to verify and validate the principles developed.
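A minimal sketch of what a backtest does is shown below: it replays historical prices through a toy moving-average crossover rule and accumulates the hypothetical P&L. The rule, window lengths and price series are placeholders; a production backtester would also model transaction costs, slippage and latency before any optimization.

```python
import numpy as np

def backtest_moving_average(prices, fast=3, slow=6):
    """Replay a price series through a simple moving-average crossover rule.

    Position is +1 when the fast average is above the slow one, -1 otherwise;
    returns the cumulative P&L of holding one unit under that rule.
    """
    prices = np.asarray(prices, dtype=float)
    pnl = 0.0
    position = 0
    for t in range(slow, len(prices)):
        pnl += position * (prices[t] - prices[t - 1])   # mark yesterday's position to market
        fast_ma = prices[t - fast:t].mean()
        slow_ma = prices[t - slow:t].mean()
        position = 1 if fast_ma > slow_ma else -1       # rebalance after today's close
    return pnl

# Toy price series; in practice this would be historical market data.
prices = [100, 101, 102, 101, 103, 104, 103, 105, 106, 104, 107]
print(backtest_moving_average(prices))
```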

The systems engineering principles, frameworks and strategy development can be the key to success for implementing high frequency trading or quantitative investment systems.

Can any currency be converted into any other currency?

That depends which currency you hold and which currency you wish to acquire, but in general, this should be possible. Consider, as an example, that an investor holds euros (EUR) and wishes to convert those to Australian dollars (AUD). This investor could purchase AUD directly with their EUR; they would not have to first sell their EUR into U.S. dollars (USD) and then use the USD to purchase AUD (although they may choose to do so).

Currency prices are most commonly quoted versus the USD in the U.S., but there are many actively traded currency pairs that do not involve the USD, and many of these exhibit very high levels of liquidity. For example, the euro/British pound (EUR/GBP), euro/Swiss franc (EUR/CHF), and Australian dollar/Japanese yen (AUD/JPY) are all highly liquid currency pairs, to name but a few.

Generally speaking, the G10 currencies are very liquid and readily convertible from one to another. Some less frequently traded currency pairs may exhibit lower liquidity and hence larger spreads and therefore higher transaction costs. The decision whether to convert directly, or to first sell into USD and then use the USD to purchase the desired currency, depends largely on the level of liquidity the currency pair exhibits.
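The arithmetic behind that decision is straightforward and can be sketched as follows; the quotes below are invented for illustration and follow the usual conventions (EUR/AUD and EUR/USD quoted as units of the second currency per euro, AUD/USD as US dollars per Australian dollar).

```python
def aud_received(amount_eur, eur_aud_bid, eur_usd_bid, aud_usd_ask):
    """AUD obtained for a given amount of EUR, directly vs. via USD.

    All rates are hypothetical dealer quotes, used only to compare the two routes.
    """
    direct = amount_eur * eur_aud_bid                    # sell EUR straight into AUD
    via_usd = amount_eur * eur_usd_bid / aud_usd_ask     # EUR -> USD, then USD -> AUD
    return direct, via_usd

# Illustrative two-leg comparison for 1,000,000 EUR.
direct, via_usd = aud_received(1_000_000,
                               eur_aud_bid=1.6480,
                               eur_usd_bid=1.0800,
                               aud_usd_ask=0.6560)
print(f"direct:  {direct:,.0f} AUD")
print(f"via USD: {via_usd:,.0f} AUD")
```

With liquid crosses such as EUR/AUD the direct route is typically competitive; for thinly traded pairs, the two-leg route through USD can work out cheaper despite the extra spread.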

An exception to this is when a currency is non-convertible, such as the Chinese yuan (CNY). In a future Currency Corner we will discuss non-convertible currencies in more detail.

Foreign Exchange Market

The foreign exchange market (forex, FX, or currency market) is a global decentralized market for the trading of currencies. The main participants in this market are the larger international banks. Financial centers around the world function as anchors of trading between a wide range of different types of buyers and sellers around the clock, with the exception of weekends. EBS and Reuters Dealing 3000 are two main interbank FX trading platforms. The foreign exchange market determines the relative values of different currencies.

The foreign exchange market works through financial institutions, and it operates on several levels. Behind the scenes, banks turn to a smaller number of financial firms known as “dealers,” who are actively involved in large quantities of foreign exchange trading. Most foreign exchange dealers are banks, so this behind-the-scenes market is sometimes called the “interbank market,” although a few insurance companies and other kinds of financial firms are involved. Trades between foreign exchange dealers can be very large, involving hundreds of millions of dollars. Because of the sovereignty issue when involving two currencies, Forex has little (if any) supervisory entity regulating its actions.

The foreign exchange market assists international trade and investment by enabling currency conversion. For example, it permits a business in the United States to import goods from the European Union member states, especially Eurozone members, and pay in euros, even though its income is in United States dollars. It also supports direct speculation in the value of currencies, and the carry trade, speculation based on the interest rate differential between two currencies.

In a typical foreign exchange transaction, a party purchases some quantity of one currency by paying some quantity of another currency. The modern foreign exchange market began forming during the 1970s, after three decades of government restrictions on foreign exchange transactions under the Bretton Woods system of monetary management, which had established the rules for commercial and financial relations among the world’s major industrial states after World War II. Countries gradually switched from the fixed exchange rates of the Bretton Woods regime to floating exchange rates.