Wednesday, 30 November 2016

Back to the future for IBOR

The complexity of the investment management industry is growing, and the data that needs to be analysed is richer than ever before. This is a consequence of the blurring of geographical boundaries within portfolios and of the search for alpha, which drives managers to incorporate more diverse and esoteric asset classes within a single portfolio.

The impact of this changing environment has been a resurgence in the industry's use of the Investment Book of Record (IBOR), a central and comprehensive source that tells the complete story of a firm's portfolio activity. An IBOR provides a timely view of a firm's exposures, portfolio positions and cash. The fullness and clarity of the picture it paints mean that it provides the intelligence and insights on which many portfolio decisions are made.
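The core idea, a single record from which positions and cash can be derived at any point in time, can be illustrated with a toy sketch. All names here (`Transaction`, `PositionBook`) are hypothetical and for illustration only; a production IBOR also handles corporate actions, settlement states, multiple currencies and much more.

```python
from collections import defaultdict
from datetime import datetime

class Transaction:
    """One recorded portfolio event (illustrative only)."""
    def __init__(self, when, asset, quantity, cash_delta):
        self.when = when              # trade timestamp
        self.asset = asset            # instrument identifier
        self.quantity = quantity      # signed units bought (+) or sold (-)
        self.cash_delta = cash_delta  # signed cash movement

class PositionBook:
    """Derives a timely view of positions and cash by replaying transactions."""
    def __init__(self):
        self.transactions = []

    def record(self, txn):
        self.transactions.append(txn)

    def positions_as_of(self, when):
        """Replay all transactions up to 'when' to produce positions and cash."""
        positions = defaultdict(float)
        cash = 0.0
        for t in sorted(self.transactions, key=lambda t: t.when):
            if t.when <= when:
                positions[t.asset] += t.quantity
                cash += t.cash_delta
        return dict(positions), cash

book = PositionBook()
book.record(Transaction(datetime(2016, 11, 1), "BOND_XYZ", 100, -100_000.0))
book.record(Transaction(datetime(2016, 11, 15), "BOND_XYZ", -40, 41_000.0))
positions, cash = book.positions_as_of(datetime(2016, 11, 20))
```

Because the view is derived rather than stored, the same book can answer "what did we hold as of any date?" consistently, which is the quality the passage above describes as timeliness and fullness.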

Monday, 14 November 2016

The new Basel IRRBB: regulatory and internal consequences

Last April, the Basel Committee issued its new standard on interest rate risk in the banking book, presenting a new standardised framework to be implemented by 2018. Here Xavier Dubois, Senior Risk & Finance Specialist for Wolters Kluwer’s Finance, Risk and Reporting business, looks at some aspects of the standardised framework, how it could be implemented in Europe, and its relevance for banks' governing bodies.

In April, the Basel Committee on Banking Supervision issued its standards for Interest Rate Risk in the Banking Book (IRRBB). The standards revise the Committee's 2004 Principles for the management and supervision of interest rate risk, which set out supervisory expectations for banks' identification, measurement, monitoring and control of IRRBB, as well as for its supervision.

In a nutshell, the new standard represents a significant improvement in the management of interest rate risk in the banking book. It provides a standardised measurement that is closer to economic reality, and thus more useful for bank management, particularly in this period of low interest rates; it also brings a degree of standardisation that increases transparency, for banks and supervisors alike. Banks will have to adopt the new framework, and they should take the opportunity to move towards a technologically sound and solid risk framework, with automation and integration, that serves supervision and, last but not least, the governing body.
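To give a flavour of the standardised measurement, the framework centres on the change in the economic value of equity (ΔEVE) under a set of prescribed interest rate shock scenarios, with the worst loss across scenarios used as the headline measure. The sketch below is a drastic simplification, not the Basel methodology: it shows only the shape of the calculation, uses just two of the six prescribed scenarios, and omits cash flow slotting, commercial margins and behavioural options entirely. All function names and figures are illustrative.

```python
def eve(cash_flows, rates):
    """Economic value: present value of net repricing cash flows.
    cash_flows: list of (time_in_years, amount); rates: dict time -> zero rate."""
    return sum(cf / (1 + rates[t]) ** t for t, cf in cash_flows)

def delta_eve(cash_flows, base_rates, scenarios):
    """Return the scenario producing the worst EVE loss, and that loss."""
    base = eve(cash_flows, base_rates)
    losses = [(name, base - eve(cash_flows, shocked))
              for name, shocked in scenarios.items()]
    return max(losses, key=lambda x: x[1])

# Toy banking book: a 1y liability outflow and a 5y asset inflow.
flows = [(1.0, -95.0), (5.0, 110.0)]
base = {1.0: 0.01, 5.0: 0.015}
scenarios = {
    "parallel_up":   {1.0: 0.03, 5.0: 0.035},  # +200bp parallel shock
    "parallel_down": {1.0: 0.00, 5.0: 0.00},   # down shock, floored at zero
}
worst_name, worst_loss = delta_eve(flows, base, scenarios)
```

Even this toy version shows why the measure is informative for governing bodies: the long asset against the short liability loses economic value when rates rise, and the scenario grid makes that exposure explicit and comparable across banks.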

Tuesday, 8 November 2016

Why banks need consumers to detect imposters

In the first half of 2016 alone, there were more than one million incidents of financial fraud, an increase of 53 per cent on the same period in 2015, while identity fraud against individuals cost an estimated five billion pounds last year.

Identity fraud occurs when an imposter pretends to be someone else. To prevent this, banks ask customers for passwords, but judging from the fraud figures this isn’t working, and things are getting worse. The reason is simple: data cannot differentiate. A password provided by the true customer is exactly the same as that password provided by an imposter.
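The point can be made concrete with a minimal sketch of a server-side password check. The names and password here are illustrative, and a real system should use a salted, slow hash such as bcrypt or scrypt rather than bare SHA-256; the sketch only demonstrates that the check sees the submitted bytes, not the person submitting them.

```python
import hashlib

# The bank stores a hash of the customer's password, never the password itself.
STORED_HASH = hashlib.sha256(b"correct-horse").hexdigest()

def verify(submitted: str) -> bool:
    """Accept or reject based purely on the submitted data."""
    return hashlib.sha256(submitted.encode()).hexdigest() == STORED_HASH

# The check returns the same answer regardless of who types the password:
genuine_customer = verify("correct-horse")   # the true customer logging in
stolen_credentials = verify("correct-horse") # an imposter with the same string
wrong_guess = verify("wrong-guess")
```

`genuine_customer` and `stolen_credentials` are identical results produced from identical inputs, which is exactly why the passage argues that passwords alone cannot detect imposters.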

Wednesday, 2 November 2016

Key to the highway: The changing face of high and low touch execution

In the beginning there was high touch, where brokers provided a high-value, solution-based approach to finding the liquidity their buy-side clients were looking for. This worked in an era of high fees and low scrutiny of what end investors' trading commissions were actually funding. But as markets electronified and buy-side operations tooled up, a new paradigm was born: low touch. It reflected the buy side's growing desire for cheaper execution, especially for trades that weren’t that hard to execute, and it also offered a path that minimised information leakage.

The result? Two routes to market with very different price tags. The problem was that brokers had to duplicate their trading infrastructure while receiving fewer net commission dollars. This spawned the short-lived concept of mid touch, which offered the worst of both worlds: junior sales traders with neither the experience nor the expertise to manage either style of execution well. And so the industry muddled along, ignoring the operational overhead of running two technology stacks.