We are doing a lot of work at Reuters these days to understand the increasingly divergent requirements of human and machine users, and then to adapt our services to them.

Reuters has a 155-year history of serving the information needs of human beings around the world, but we were also early to recognize that in the financial services industry many of the consumers of our data were in fact machines.

The two co-existed comfortably for many years, with feeds of Reuters data routed to terminals for human display and drawn upon by a variety of customer applications, such as risk management or end-of-day portfolio pricing systems.

However, since the beginning of this century, the rise of algorithmic trading, coupled with the explosion in derivatives and with regulatory change, has created a situation in which the needs of humans and their machines are diverging. This in turn has profound ramifications for user interfaces as well as systems architecture, both for Reuters and for our customers.

To provide market data and other information for machine consumption, the key attributes are: a comprehensive yet extensible data model to represent the information being delivered; rich metadata that describes the underlying content so that machines can use it more fully; a published API that makes it easy for customers to write applications against the provider’s data; a published and extensible symbology (like Reuters RICs) that allows applications to identify the data and associate it with companies, markets and instruments; and finally, and of increasing importance, raw speed in the form of ultra-low-latency datafeeds.
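To make the first few of those attributes a little more concrete, here is a minimal sketch in Python of what a machine-readable quote message, keyed by a RIC-style symbol and carrying descriptive metadata, might look like. The field names and structure are purely illustrative assumptions, not Reuters' actual feed format or API.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class QuoteMessage:
        """Hypothetical machine-readable quote, keyed by a RIC-style symbol."""
        ric: str                  # e.g. "VOD.L" identifies the instrument
        bid: float
        ask: float
        timestamp: datetime
        # Descriptive metadata so consuming applications know what the fields mean
        metadata: dict = field(default_factory=dict)

    msg = QuoteMessage(
        ric="VOD.L",
        bid=135.25,
        ask=135.30,
        timestamp=datetime.now(timezone.utc),
        metadata={"currency": "GBp", "asset_class": "equity", "venue": "LSE"},
    )

The point of the symbology and metadata fields is that a downstream application can identify the instrument and interpret the numbers without a human ever reading the message.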

Back on the human side, things have not stood still either. Instead of just a flashing screen of impossibly fast-moving news and data, customers need increasingly sophisticated analytics to make sense of the torrent of available information; they need intuitive design, intelligent search and far better graphics to find and display that information; and they often need a reliable mechanism to slow the data down, passing along only the changes that affect valuation rather than showing how many times an instrument can update in one second.
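That last requirement is essentially a conflation filter. The sketch below, using an assumed and purely illustrative price-move threshold, forwards an update only when the price has moved enough to matter and drops the rest; it is not a description of any particular Reuters product.

    class ConflationFilter:
        """Pass an update downstream only when the price has moved by more than
        a minimum amount since the last update that was forwarded."""

        def __init__(self, min_move=0.05):
            self.min_move = min_move       # illustrative threshold, not a real setting
            self.last_forwarded = {}       # RIC -> last price passed downstream

        def accept(self, ric, price):
            last = self.last_forwarded.get(ric)
            if last is None or abs(price - last) >= self.min_move:
                self.last_forwarded[ric] = price
                return True                # meaningful move: pass it along
            return False                   # noise: drop the tick

    f = ConflationFilter(min_move=0.05)
    for p in [100.00, 100.01, 100.02, 100.07, 100.08]:
        if f.accept("VOD.L", p):
            print("forward", p)            # forwards only 100.00 and 100.07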

In short, the separation of so-called "alpha" and "beta" that has been discussed for some time in financial markets has as its corollary a separation of the man and the machine involved in their production. It is overly simplistic to say that machines produce "beta" (the ability to replicate market risk) and that only humans can achieve "alpha" (the ability to outperform the market). However, the specialization and separation of tasks between humans and machines is likely to accelerate as processing power continues to increase with new multicore CPUs, as data mining and other analytical software improves, and as the performance records of passive index funds and ETFs, absolute-return funds and traditional long-only funds are better understood.
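For readers less familiar with the terms, alpha and beta come from the textbook decomposition of a portfolio's return into the part explained by the market and the residual the manager adds (roughly, portfolio return = alpha + beta x market return). The small sketch below estimates both from return series using the standard formulas; the numbers are made up for illustration and this is not any proprietary Reuters analytic.

    # Textbook decomposition: r_portfolio ~ alpha + beta * r_market
    # beta  -- the exposure that simply replicates market risk
    # alpha -- the return left over after accounting for that exposure

    def alpha_beta(portfolio_returns, market_returns):
        n = len(market_returns)
        mean_p = sum(portfolio_returns) / n
        mean_m = sum(market_returns) / n
        cov = sum((p - mean_p) * (m - mean_m)
                  for p, m in zip(portfolio_returns, market_returns)) / n
        var_m = sum((m - mean_m) ** 2 for m in market_returns) / n
        beta = cov / var_m
        alpha = mean_p - beta * mean_m
        return alpha, beta

    # Example: a fund that roughly tracks the market plus a small extra return
    alpha, beta = alpha_beta([0.012, -0.004, 0.021], [0.010, -0.005, 0.018])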

Another facet of the same phenomenon can be seen on the trading floor, as machines increasingly replace humans in high-velocity, thin-spread markets such as spot FX, cash equities and US Treasuries, while expensive humans reach for higher spreads in (as yet) less transparent markets such as credit and other derivatives and structured products.

All of this is good news for Reuters, as we designed our Core Plus strategy two years ago to capitalize on these trends.