LMAX disruptor framework and whitepaper

This is old news by now and I'm very late in posting it, but since I'm still coming across people who remain blissfully unaware, I thought it worth reiterating. If you haven't come across this yet, drop everything else and read about the LMAX Disruptor framework and the associated whitepaper titled Disruptor: High performance alternative to bounded queues for exchanging data between concurrent threads. There is also an associated (and now rather dated) InfoQ presentation titled How to Do 100K TPS at Less than 1ms Latency.

In the beginning there was a main thread of execution, then came two, and then thousands. Once we had scaled threads to the point of starvation came SEDA and the concept of queues: hierarchical topologies of queues, with many writers and readers operating on them and threads relegated to second-class-citizen status. For a while the industry rested in the assurance that innovation on latency had reached an equilibrium. Then, out of the blue, LMAX happened. LMAX (London Multi Asset eXchange) is the highest-performance financial exchange in the world.

Read the whitepaper to find out just how outdated conventional wisdom on concurrent queuing in Java actually is, and how a lack of awareness of how your financial code performs end-to-end, from hardware to VM, could be creating bottlenecks for your platform. The essence of the Disruptor framework is a strikingly simple concept, yet one that is profound not only in its effectiveness in attaining its goal of reducing latency, but also in the extent to which it leverages knowledge of the hardware and the Java virtual machine it runs on.
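To make those mechanics concrete, here is a minimal sketch of publishing and consuming events through a Disruptor ring buffer. It is written against the later 3.x DSL API rather than the one current when the whitepaper was published, and ValueEvent and the handler are my own illustrative names, not anything from the LMAX sources:

```java
import com.lmax.disruptor.EventHandler;
import com.lmax.disruptor.RingBuffer;
import com.lmax.disruptor.dsl.Disruptor;
import java.util.concurrent.Executors;

public class DisruptorSketch {

    // Events are mutable and pre-allocated once; the ring buffer recycles
    // them, so the hot publishing path allocates nothing and avoids GC.
    static final class ValueEvent {
        long value;
    }

    public static void main(String[] args) {
        int bufferSize = 1024; // ring size, must be a power of two

        Disruptor<ValueEvent> disruptor = new Disruptor<>(
                ValueEvent::new, bufferSize, Executors.defaultThreadFactory());

        // The consumer tracks its own sequence; no locks on the fast path.
        EventHandler<ValueEvent> handler = (event, sequence, endOfBatch) ->
                System.out.println("consumed " + event.value);
        disruptor.handleEventsWith(handler);
        disruptor.start();

        RingBuffer<ValueEvent> ring = disruptor.getRingBuffer();
        for (long i = 0; i < 5; i++) {
            long seq = ring.next();        // claim the next slot
            try {
                ring.get(seq).value = i;   // write into the recycled event
            } finally {
                ring.publish(seq);         // make the slot visible to the consumer
            }
        }
        disruptor.shutdown();
    }
}
```

Note the two ideas the whitepaper elaborates on: slots are claimed, written and published by sequence number rather than enqueued and dequeued, and the pre-allocated ring means steady-state operation creates no garbage at all.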

It proves wrong, beyond doubt, the rather outdated mindset that questions employing Java for financial low-latency use cases. Ever since Java 5, and particularly Java 6, the JVM has dwarfed the Java language in its importance, capabilities and scope; as a result, utilising Java is now fundamentally synonymous with utilising the JVM, which is what makes the language so compelling.

It isn’t about the code that you write. It’s about the code that is interpreted and then compiled to run natively. It is naive to consider only the language, as many seem to be doing in the light of the imminent release of Java 7. It’s important to bear in mind that whilst language sugar is important, if runtime performance matters to you then you’ll want to focus on: (1) the VM, (2) writing wholly non-idiomatic Java, and (3) opposing conventional wisdom at every level of abstraction, every step of the way.
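To give point (2) some substance, below is a sketch of one of the non-idiomatic tricks the Disruptor made famous: padding a hot, single-writer counter so that it sits alone on its 64-byte cache line and cannot falsely share that line with another thread's data. The class and field names are illustrative, not the Disruptor's own; its real Sequence class uses a similar, though not identical, layout:

```java
// Padding via an inheritance chain discourages the JVM from stripping or
// reordering the unused fields, keeping `value` alone on its cache line.
class LhsPadding {
    protected long p1, p2, p3, p4, p5, p6, p7;
}

class Value extends LhsPadding {
    protected volatile long value;
}

class RhsPadding extends Value {
    protected long p9, p10, p11, p12, p13, p14, p15;
}

// A single-writer counter: no locks are ever taken, and the padding means
// an update never invalidates a cache line holding a neighbour's data.
public final class PaddedCounter extends RhsPadding {
    public long get() {
        return value;
    }

    public void increment() {
        value = value + 1; // safe only if exactly one thread ever writes
    }
}
```

Fourteen dead long fields around a single value is exactly the kind of code no style guide would sanction, which is precisely the point: at this level you are programming the hardware through the JVM rather than writing idiomatic Java.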

Tech arms race in the Tron landscape

On the train back to the real world this evening, for another year at work, I came across a fascinating article in the New York Times titled ‘Electronic Trading Creates A New Trading Landscape – The New Speed Of Money Reshaping Markets’. For the duration of that journey I was wholly engrossed in the article and the radiating thought processes on technology and finance that it triggered effortlessly and constantly. It was an inspiring read, and one that made me glad and relieved to work in this industry.

Predominantly it talked about how, over time, smaller exchanges (such as Direct Edge) had eroded the overwhelming dominance and market share of the historic NASDAQ-NYSE exchange duopoly, and how, in the process, New Jersey had been transformed into ‘the heart of Wall St’ through the placement within it of data centres that now host and operate some of the largest stock exchanges in the US. The article’s charming reference to a ‘Tron landscape’ was based on the likeness of the blue phosphorescent lighting that illuminates those data centres to that of the film.

More interesting to me, however, was the story of how this progression had been driven at its core by the breakneck speed and sheer extent of technological automation, advancement and innovation, leaving traders, regulators and the market struggling to keep up in its trail. So where are we now? Exchanges are distributed, leaner and more competitive. Through colocation, software advancement, closer proximity to targets, and new fibre-optic pathways constantly being laid along critical geographic exchange data routes, trading is faster. Through high-frequency trading, dark pools and strategic algorithms, trading is more intelligent, allowing arbitrage and price exploitation through micro-trading under stealth.

What, however, have been the consequences of such advancements over time? The use of HFT to place a very large bulk order in small increments was found to be the root cause of last May’s market crash, when the HFT algorithm in question continued placing trades as part of a large order despite prices sinking part way through. As a result, the SEC and the exchanges introduced a halt to trading in an individual stock if its price falls more than ten percent within a five-minute period. Dark pools have been in the spotlight for being opaque and exempt from public scrutiny. And there is talk of regulating not only data centres and colocation but perhaps technology’s greatest achievement of all: speed. The unattended and perhaps ill-considered advancement of technology, for mostly selfish motives, has shifted control, transparency and ethical consideration disproportionately away from human discretion and towards machine code. Can technology continue to dominate this industry’s progression at its core, and to its advantage, or will it become the victim of its own success? I wonder where we go from here. What do you think?