Reuters technical development chronology: Retrospective

Some concluding thoughts on the story so far.

ARCHITECTURE AND DESIGN

Reuters’ venture into computing began in the 1960s with Stockmaster, Customprice, then Monitor. In that era, international communications lines were slow (e.g. 150 bits per second) and expensive. Mainframe computers occupied huge halls.

By the late 1960s the minicomputer challenged this classic architecture. Centrally hosted applications connected to dumb screens could now be superseded by a modular, distributed approach in which the network and the application could be shared across a number of interconnected, smaller and cheaper minicomputers. This approach also helped to reduce the capital-expenditure risk of projects.

In the early 1970s a number of London-based software houses were starting to propose minicomputer-based solutions for real time applications, like message switching, using the Digital PDP range.

One of them (Scicon) was chosen to develop the original Monitor system for Reuters. (Interestingly the project manager and lead designer assigned to the project by Scicon later joined the Instinet subsidiary and played a similar role in their first generation systems.)

The design for Monitor exploited the minicomputer to advantage, as the main architecture called for relaying data rapidly from source to client, with relatively little "added-value" processing in between. Horizontal expansion of the system by replicating components was foreseen, making the addition of clients and geographies easier than with a centralised model.
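
As a rough illustration of that horizontal expansion, the sketch below uses invented Python class and site names; it is not a description of the actual Monitor components. The point is simply that capacity grows by attaching another relay node, each serving its own clients, rather than by enlarging one central machine.

    # Hypothetical sketch: horizontal scaling by replicating relay nodes.
    class Relay:
        """Forwards each update to directly attached clients or child relays."""
        def __init__(self, name):
            self.name = name
            self.downstream = []          # client callbacks or further relays

        def attach(self, receiver):
            self.downstream.append(receiver)

        def publish(self, update):
            for receiver in self.downstream:
                if callable(receiver):
                    receiver(update)           # a client connection
                else:
                    receiver.publish(update)   # another relay, one hop on

    # Adding a geography means adding a relay, not re-engineering the centre.
    london = Relay("london")
    asia = Relay("asia")                  # a new region: just another replica
    london.attach(asia)
    asia.attach(lambda update: print("client sees:", update))
    london.publish({"instrument": "VOD.L", "bid": "101.5"})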

From the 1970s until the end of the 1980s, technical gaps in the proposed design for a new product or enhancement, or problems with the efficiency and cost of available technology, could be resolved by building a solution in-house. There were countless examples, particularly in communications, where this was true. It was also true that Reuters’ developers were ahead of the suppliers in understanding new technologies that were relevant to the design problems posed by high availability real time systems.

An architecture that could cater, cost effectively, for increasing traffic rates and for the connection of new customers fuelled the growth of the company. Monitor, Dealing and IDN exceeded their original design targets by orders of magnitude. The harnessing of minicomputers, network skills, communications expertise and a deep knowledge of what it meant to design for availability, speed and extensibility was a key success factor for Reuters in that era.

A major pillar of the architecture in the 1980s and 1990s was the idea of an “open systems” interface to which clients could connect their own systems. This included the revolutionary concept of logical data which clients could use in their own applications without complex translation. What was not always taken on board was careful separation of the application from the network and the client; in the interests of speed to market, this separation was neglected. It did not seem to matter too much at the time, but as the application processing and the data types carried became more complex, greater agility in the architecture was called for to enable upgrades and changes to be deployed throughout the network and the client connection points.
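
To make the "logical data" idea concrete, here is a minimal sketch, assuming an invented field dictionary rather than the real one: the feed carries tagged field values which a client program maps directly to named fields, instead of scraping a screen image.

    # Hypothetical sketch of logical (field-tagged) data.
    # Field identifiers and names are invented, not the actual data dictionary.
    FIELD_DICTIONARY = {22: "BID", 25: "ASK"}

    def decode(pairs):
        """Turn wire-format (field_id, value) pairs into named fields."""
        return {FIELD_DICTIONARY[fid]: value for fid, value in pairs}

    # The client application works with meaningful names, no translation layer.
    tick = decode([(22, "101.5"), (25, "101.7")])
    assert tick["BID"] == "101.5"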

The design weaknesses were recognised, but as time passed the implementation in a very distributed network became more difficult. So, essential changes were increasingly delayed or fudged. Backward compatibility became an issue and the increasing spaghetti started to affect product quality and time to market.

The architecture did cope with the floods of market data, year in, year out, but it did not give much breathing space to consider how to remedy weaknesses and deal with them.

By the 1990s, the Internet emerged as a major factor and greater emphasis was placed on “buy don’t build”, that is, using off-the-shelf components from suppliers. Application processing was also becoming much more important. This in turn required even more agility in the Reuters distributed architecture to adapt to rapid change. Other information vendors, like Bloomberg, were operating a model with a central mainframe and relatively simple closed terminals, connected via a general purpose communications network. This approach had the advantage that application processing was carried out at one central point, rather than distributed, so it was much easier to implement change.

As “buy don’t build” increased in importance, and more reliance was placed on suppliers, it became much harder to maintain adequate control over the architecture, which became a function of the suppliers’ own ideas and products.

Further, in addition to real time distribution of information and sophisticated client site offerings, the product designs called for large databases of historical information to become part of the overall architecture. There was little in-house expertise available on the subject, and so the early attempts were fraught with problems. It took over ten years to recover in this particular area.

The organisation for product definition and development was also distributed internationally and often under geographical control with “dotted line” reporting on technical direction. While this helped satisfy local market requirements it made it difficult to define and enforce the uptake of technical standards, like common application programming interfaces.

In an increasingly complex world, conformity to internal standards is essential to achieve swift change. But, as Professor Andrew Tanenbaum, creator of the MINIX operating system that inspired Linux, observed, “the nice thing about standards is that you have so many to choose from”. So careful choice is key.

UNDERSTANDING THE CUSTOMER AND THE PRODUCTS

There was often a lack of deep understanding of what the client really needed, perhaps a consequence of the golden era when Reuters dictated product direction, or, when this was understood, an inability to organise to deliver it.

In previous years, the clients (grudgingly) admired what we could give them, but this was no longer the case. In some cases technical thinking dominated weak product thinking, with the result that an unusable product was released. User unfriendliness became a problem, courtesy of an adherence to industry standards and a fascination with the Microsoft way.

A deep understanding of the data should have been fundamental, rather than the constant addition of minor new features and a glossy appearance.

Perhaps we could have been less insular and worked more closely with customers. Sometimes we bought in partial understanding of clients' needs via acquisition, inevitably encountering some charlatans on the way.

In the 1970s and 1980s the Monitor and Quotes services were relatively simple to understand. They were straightforward to demonstrate, sell and install. With the development of technology and increased integration, customers expected their more complex problems to be addressed, including position keeping and risk analysis. Sales people had to be much more aware of the customer environment and of how our products worked in that environment.

ACQUISITIONS

Many companies were acquired in the 1990s, often in an attempt to enhance the product line rapidly. It is not clear that we got the hang of managing acquisitions. Some were extremely successful, like Instinet, TIBCO and Effix, all of which operated largely independently.

Others were much less successful. It was easy to understand the acquisition of Quotron, Bridge and Telerate in order to buy market share. However, considerable behind-the-scenes effort went into integrating them with Reuters and achieving the savings promised when each company was acquired, savings which often proved impossible to realise.

Acquisitions often came with a different approach to technical matters, which some found irresistible but which was impossible to integrate easily and quickly. It was then easily forgotten that this very approach had put the acquired company up for sale in the first place. It all led to a great deal of time-wasting debate, with changes being introduced for no good reason.

Things became most complex at the client site. We had a range of PC and Unix-based terminal devices. The work of development groups in Paris, Sydney, Colchester, London, Hong Kong, Chicago, and New York had to be combined to form a consistent customer product. This was always a struggle.

GENERAL TRENDS AT THE END OF THE 1990s

A large amount of effort was spent on consolidating and upgrading mainstream products to keep up with unavoidable external change. The curse of legacy. The design of IDN started to move towards a very distinct “by interest” distribution so that updates were only delivered where required. This moved away from the “all data everywhere” model which had made cost-efficient design in some geographies so complex. Satellite distribution permitted delivery of popular updates to all sites at low marginal cost and finally came of age with a distribution bandwidth which made economic sense.
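
The shift from "all data everywhere" to "by interest" distribution can be sketched as follows. The router and site names here are hypothetical, not IDN's actual mechanism; the principle is that an update travels only to the sites that have registered interest in that instrument.

    # Hypothetical sketch of "by interest" distribution.
    from collections import defaultdict

    class InterestRouter:
        def __init__(self):
            self.interest = defaultdict(set)    # instrument -> interested sites

        def subscribe(self, site, instrument):
            self.interest[instrument].add(site)

        def route(self, instrument, update):
            """Deliver the update only where interest was registered."""
            return {site: update for site in self.interest.get(instrument, ())}

    router = InterestRouter()
    router.subscribe("tokyo", "VOD.L")
    assert "tokyo" in router.route("VOD.L", {"bid": "101.5"})
    assert router.route("GBP=", {"bid": "1.29"}) == {}   # no interest: no traffic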

Acquisitions were drawing valuable technical effort away from mainstream systems on an increasing scale. Some decisions regarding third party products were imposed, with technical due diligence being merely a fig leaf for a management decision that had already been made. Whilst acquisitions were an easy way to enhance the product line, in reality they could, over time, weaken architectural coherence and complicate the roll-out of enhancements. Product and technical spaghetti were the outcome, leading to change paralysis and a lack of manoeuvrability.

The Internet had entered mainstream thinking, and the industry started to understand and solve the design problems of real time distribution of complex data types. The timing of introducing these technologies into mainstream product development at Reuters was complicated. Even the simple act of updating information on a screen had not been properly solved by the industry at this point. Even so, some local products were developed and deployed, so it is incorrect to say that the Internet had not been embraced. Perhaps it was not adopted quickly enough in appropriate areas of mainstream product development. This all seems inconceivable in today’s world of instant social media updating.

Economies of scale were not achieved as expected. Time and again rationalisation decisions were sacrificed because of the inability of the organisation to agree.

The open systems philosophy of harmonising with the client’s technology rather than imposing Reuters’ own turned out to be a double-edged sword. The design of the interface protocols did not adequately de-couple the client from Reuters, so changes were stalled by clients delaying upgrades. It also made the client site developments ever more complex.
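
One common remedy for the missing de-coupling, sketched here as a general technique rather than anything Reuters actually built, is the "tolerant reader": the client keeps the fields it understands and silently ignores the rest, so the host can extend its messages without waiting for a synchronised client upgrade.

    # Hypothetical sketch of a tolerant client reader.
    KNOWN_FIELDS = {"BID", "ASK"}      # what this (older) client understands

    def read_update(message):
        """Keep the fields we know; skip anything the host added later."""
        return {k: v for k, v in message.items() if k in KNOWN_FIELDS}

    # The host now sends an extra field; the old client still works unchanged.
    new_style = {"BID": "101.5", "ASK": "101.7", "MID": "101.6"}
    assert read_update(new_style) == {"BID": "101.5", "ASK": "101.7"}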

The passion for adopting standards, particularly at the client site, was also a double-edged sword. It led to complicated applications and persistent user unfriendliness in spite of the attempts at a glitzy appearance.

Generating revenue from back office and allied products turned out not to be as easy as thought. It required deep product knowledge and close integration with the customer’s own technology. Not a recipe for an easy profit.

Development strengths towards the end of the 1990s:

  • The ability to battle on with major infrastructural enhancement projects on the main product lines. IDN soldiered on with performance demands orders of magnitude greater than the original design. Although D2000-2 and GLOBEX didn’t reach their product potential, technically they performed to the level required. 
  • World class Windows, PC and UNIX expertise. 
  • As had been inherent since the 1960s, the development and extension of very large networks operating 24/7 with the most demanding performance criteria.
  • Leading edge research efforts in related subjects like communications and web standards.
  • Management of large technical projects.
  • A determination and commitment to improving the quality and predictability of software deliverables.
  • An organisation which started to harness software development expertise in many other geographies, not just London, New York and Hong Kong. However, the West Coast-based initiatives failed to inject as much into the global architecture as had been hoped.
  • And finally, a resilience to the difficult environment that the development groups found themselves in where the latest fad or acquisition was seen as the reason to rip up the design and architecture and start again.

POST CHRONOLOGY ACTIVITIES

Decisions were made to hive off communications into a new independent entity called Radianz. This was a recognition that communications could no longer be a core competence; one of the original technical expertise fundamentals disappeared completely. It caused immediate practical problems, because a separate organisation with its own agenda had been inserted into the customer delivery chain. It is a problem most of us now know from having a broadband provider and a telecoms infrastructure provider that are different entities. Makes sense until you have a problem.

The fledgling organisation went further and marketed itself as a hosting company, or in modern parlance a deliverer of “cloud computing”. This was in anticipation of a move by Reuters to embrace the Internet completely. Conversion of the existing product line to a hosted model proved tricky, of course, and the initiative did not succeed. Others, like Microsoft, Google and Amazon, have now got the hang of it and derive a healthy profit from cloud computing. Perhaps it would all have been a different story had the initiative been kept in house and grown organically, as it was at Amazon. The lure of Internet wealth might be to blame.

The first attempt to replace Reuters’ information services with an Internet technology-based offering had to be abandoned. It all proved too complicated. There was probably a way, but it was never found: arrogant architectural thinking, perhaps, or maybe it really was just too difficult. As with several attempts in the past (e.g. RSR and Rowgrabber), replacing the original, well-thought-through designs was not an easy task. A replacement historical database system did arise from the ashes, though, and successfully replaced the ageing NCR/RDB configuration.

The trading room business was no longer a revenue star. From its beginnings as Special Products in 1978, through the acquisition of RICH and then TIBCO, it had been a steady contributor to Reuters’ profits. Increasingly, the effort involved in keeping up with dealing room technologies and customer requirements meant that it was no longer the “easy money” of the 1980s and 1990s. Integration at the client site was passed over to IBM consulting. The risk business was eventually sold off altogether. Triarch/TIBCO Finance systems became more a means to support the data business.


ACKNOWLEDGEMENTS

Apart from Michael Nelson, general manager, and Glen Renfrew (who died on 29 June 2006), names in the chronology have been omitted. It would have been too easy to forget some vital contributor, and in any case development is largely a collaborative venture with few stars. The content derives from the staff magazine Reuters World, personal recollections, the books listed below and some papers of the day that had been kept.

The Reuters archives were also consulted and the two general chronologies by Jack Henry and John Entwisle were invaluable in adding to the first part of the story. The archive also provided some of the pictures used. I am particularly grateful to Trevor Bartlett who has borne the brunt of the reviewing and has been a never-ending source of helpful suggestions and knowledge.

Chronology 1964-1990 contributions:

John Adams, Trevor Bartlett, Geoff Chapman, Mike Cornish, David Cutler, John Entwisle, Maurice Fletcher, Peter Howse, Scott Jeffries, Lewis Knopf, Irving Levine, Dredge Liversedge, Mac Mackenzie, Bill Pinfold, David Russell, Herbie Skeete, David Ure, George Wallace, Glenn Wasserman, Nick White, Richard Willis.

Chronology 1991-1998 contributions:

Trevor Bartlett, Mark Daley, Justyn Davies, Mike Dewey, John Goacher, Scott Jeffreys, Peter Kingslake, Irving Levine, Mac Mackenzie, Greg Meekings, Mike Murphy, Annette Stark, Phil Stockton, Glenn Wasserman, David Ure, Richard Willis.

Books:

The Power of News - Donald Read

Castro and Stockmaster - Michael Nelson

Tales from the South Pier - John Jessop

Breaking News - Brian Mooney & Barry Simpson


Martin Davids joined Reuters in 1979 with a degree in applied mathematics and a background in military real time and commercial message switching systems. His first role was as part of a team responsible for getting the Dealing system live. Later he managed Reuters’ European technical development and headed transactions development in New York, where he was also head of information products development. When he left Reuters in 2002 he was head of legacy developments, field and technical operations. ■