Friday, September 29, 2017

A Macro View of Microservices - A Plain Vanilla Primer for Pragmatic Practitioners



The Litmus Test of Enterprise Tech: Change-readiness


Business change is constant and instant; tech teams need to be in start-up mode

Markets are getting wider, scaling out is the new norm, so is the adoption of emerging tech

Shrinking Time to Market calls for rapid development in distributed environments backed by continuous deployment

Microservices' key value prop: Change-friendliness

Their contexts and boundaries are defined in terms of distinct business capabilities

They are right-sized, invariably small (think scope of service, not lines of code)

They are independently deployable in line with business needs: for instance, a new feature or a bug fix can be deployed immediately and then tracked for performance and behavior.

Deployment decisions are choreographed in collaboration with service owners, eliminating the invariably arduous orchestration across multiple teams.

Service owners are free to choose the technologies and persistence mechanisms for building and operating individual services, with consensus on cross-team parameters like log aggregation, monitoring and error diagnosis.

Services collaborate with each other using technology-agnostic network calls

They offer a cost-effective and low-risk test bed for evaluating the effectiveness of new technologies in production environments


What does that mean for business?

Scaling is ‘on-demand’ and cost-effective, in line with business needs

Independent deployment, with quick failure isolation and rollbacks, ensures quick value delivery to the market

Ready-to-deploy business capabilities make customer engagement more holistic across different channels

Smaller codebase means significantly lower risk and cost of replacing or rewiring software (vis-à-vis the typical monolith compromise of coping with core modules running on redundant technologies)

Microservices: Here’s the deal

How they deal with change in business requirements


Unlike in monoliths, responsibilities are decomposed into services defined by business capabilities, so a change affects only the given module. Data segments are encapsulated behind APIs, while overlaps between services are mapped through higher-order services or hypermedia.
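A minimal sketch of this decomposition, with hypothetical services and data: each service owns its store and exposes only an API-shaped view, so an internal schema change never leaks into collaborators.

```python
# Hypothetical sketch: two services own their own data and collaborate
# only through each other's APIs, never through a shared database.

class CatalogService:
    """Owns product data; other services never touch its store directly."""
    def __init__(self):
        self._products = {"sku-1": {"name": "Widget", "price": 9.99}}

    def get_product(self, sku):
        p = self._products[sku]
        # Only the API-shaped view is exposed, not the internal row layout.
        return {"sku": sku, "name": p["name"], "price": p["price"]}

class OrderService:
    """Collaborates with the catalog via its API, not its database."""
    def __init__(self, catalog):
        self._catalog = catalog

    def quote(self, sku, qty):
        product = self._catalog.get_product(sku)
        return {"sku": sku, "qty": qty,
                "total": round(product["price"] * qty, 2)}

orders = OrderService(CatalogService())
print(orders.quote("sku-1", 3))  # {'sku': 'sku-1', 'qty': 3, 'total': 29.97}
```

The catalog could switch its persistence mechanism entirely and `OrderService` would be unaffected, which is the point of API-encapsulated data segments.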


How they deal with business capability enhancements or modifications


Bounded contexts enable independent deployment of the impacted service(s) without disturbing business capabilities residing in other services. This eliminates the need for time-consuming and costly regression tests of the monolith universe.


How they deal with situations where business abstractions are dependent on low-level services outside their bounded contexts


The ‘API gateway’ inverts the dependencies between clients and microservices in this scenario. The secondary abstraction is declared by the high-level abstraction within its service interface, and is implemented by the dependent collaborating services through several means: reducing network chattiness, performing protocol translations, concurrently aggregating service responses, and transforming service responses into specific end-user formats.


A closer look at API gateways


In the microservice universe, having a client make RESTful HTTP requests to individual services can be a painful user experience, given the plethora of requests to different services.

Enter the API gateway, which tailors APIs to each client’s network capabilities.

For instance, a desktop client may make multiple calls while a mobile client would make a single request. The API gateway proxies fine-grained desktop requests to the corresponding services, while handling coarse-grained mobile requests by aggregating the results of multiple service calls.

Outcome: optimized communication between clients and applications, while encapsulating microservice details.

API gateways ease the evolution of microservices: whether two microservices merge or one is partitioned into two, the update is made at the API gateway level, and clients on the other side of the gateway remain insulated from the change.
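The proxy-versus-aggregate behaviour described above can be sketched as follows; the service names and gateway interface here are hypothetical stand-ins for real network calls.

```python
# Hypothetical API gateway sketch: desktop clients call services one by
# one, while a single mobile request is fanned out and aggregated here.

def profile_service(user_id):
    return {"user": user_id, "name": "A. Customer"}

def orders_service(user_id):
    return {"user": user_id, "orders": [101, 102]}

class ApiGateway:
    def __init__(self, services):
        self._services = services  # name -> callable, resolved at the gateway

    def proxy(self, service, user_id):
        # Fine-grained: forward one request to one service (desktop style).
        return self._services[service](user_id)

    def aggregate(self, user_id):
        # Coarse-grained: one mobile call fans out to every service and
        # merges the results, hiding service boundaries from the client.
        merged = {}
        for call in self._services.values():
            merged.update(call(user_id))
        return merged

gateway = ApiGateway({"profile": profile_service, "orders": orders_service})
print(gateway.aggregate(7))
# {'user': 7, 'name': 'A. Customer', 'orders': [101, 102]}
```

If the two services were later merged or split, only the gateway's service table would change; clients keep calling `aggregate` unchanged.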


Why Microservices call for Continuous Deployment

Microservices are highly amenable to change and continuous deployment makes it rapid and reliable.

Microservices make deployment easier, so it becomes faster and more frequent.

Faster deployment ensures faster feedback from the market.

Faster feedback ensures timely improvements – making the software more responsive and secure.


Why Microservices and Polyglot Persistence go together

The microservice approach of multiple teams managing discrete services naturally implies database capability within each service, else coupling at the data level would defeat the very purpose of autonomy.

Using multiple data stores invites eventual consistency, which is an accepted compromise in most businesses. Even relational databases settle for eventual consistency when data is sent to or read from remote systems like value-chain databases.

Just as the RDBMS world uses event streams for constructing reliable views, the microservice world uses event sourcing to trigger service updates from the events it logs.

The trade-off in favor of high availability becomes even more acceptable when compared to the Multi-Version Concurrency Control issues of the relational world.
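The event-sourcing idea above can be sketched with a toy in-memory event log; the event shapes and subscriber mechanism are invented for illustration.

```python
# Minimal event-sourcing sketch: current state is never stored directly;
# it is rebuilt by replaying the append-only event log, and subscribers
# (standing in for downstream services) are notified per appended event.

log = []            # append-only event log
subscribers = []    # callbacks standing in for downstream services

def append(event):
    log.append(event)
    for notify in subscribers:
        notify(event)

def replay():
    """Fold the log into current state (here: a cart total)."""
    total = 0
    for event in log:
        if event["type"] == "item_added":
            total += event["price"]
        elif event["type"] == "item_removed":
            total -= event["price"]
    return total

seen = []
subscribers.append(lambda e: seen.append(e["type"]))

append({"type": "item_added", "price": 30})
append({"type": "item_added", "price": 20})
append({"type": "item_removed", "price": 30})
print(replay())  # 20
```

Because the log, not the derived state, is the source of truth, any service can rebuild or re-derive its own view by replaying the same events.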


Embracing Microservices: No cookbook in the kitchen


Every organization is different – you can’t mirror success stories, or even failures for that matter

How to determine whether microservices are fit for purpose – the adoption challenge

Microservices demand a paradigm shift - Cultural, Structural and Functional

Compelling benefits come bundled with significant complexities


The Adoption challenge


Greenfield projects

When businesses need to evolve rapidly, the monolith environment may work best for managing a small number of applications that deliver the firm’s competitive edge. Microservices, however, would help startups build a minimum viable product.


Brownfield projects

When established businesses need to scale rapidly across large, complex applications, microservices become fit for purpose; but given the tangled dependencies between applications, an incremental approach to the evolution is highly advisable:

Re-engineer applications based on business priority in phase one

Build API gateways to interface with monolith applications that talk to inefficient database models

Perform minimal database modifications to maintain stateful connections

Re-engineer the remaining applications in fast-track mode using phase-one templates and components

Split monolith services into microservices

Normalize relational data models and embrace efficient database models using polyglot persistence


Getting into microservice mode


Pre-adoption Diagnostics


Defining core business capabilities for decomposition into services

Dissecting services in terms of business, processes, constraints, channels and data behaviors to be able to ‘group those things that change for the same reason’

Identifying capabilities to bridge skill gaps of technical team on emerging technologies and best practices


Building the Microservices organization


Aligning the technical architecture to the business organization

Defining service boundaries, appointing service owners and custodians based on optimal team size and maximum productivity.

Promoting autonomy of service owners while enforcing rules for and monitoring implementations of ‘what the services should expose and what they should hide’

Complexities can be overwhelming

Issues of an expanding estate

Host of services

Scores of processes once resilience is built in

Several Interfaces

Network latency

Network failures


Need for value-added solutions


Versioning and Message serialization

Load balancers and messaging layers

Remote procedure calls

Backward compatibility and functional degradation

Use of home-grown code and off-the-shelf products for high-level automation, with roll-outs based on service dependencies

Asynchronicity

Need for solution-minded teams

Visionary technical architects, Competent Custodians, Highly adept DevOps team

Database professionals conversant with polyglot persistence scenarios

Intelligent co-ordination between teams

Democratic governance

Questions that demand credible answers… a partial list

How does one move forward in the absence of standards?

How does one form the Microservice team – domain experts, technical architects, emerging tech champions, networking wizards… How does one bridge the skill gap?

How does one choose technologies, techniques, protocols, frameworks, and tools?

How does one approach design, development and deployment?

How does one identify services?
How does one define service boundaries?
How does one chip off services from the monolith structure?
Why and when does one split services, merge them, or create new services?
How does one deal with service interfaces?
Is there a way to ensure a uniform way to call services?
How does one ensure the quality of code bases? Small doesn’t automatically guarantee quality.
Can one build coherence and reusability across different components?
How does one tackle the accidents post the ‘first version’ release?
How does one avoid versioning nightmares?
How does one adopt a culture of individual deployments before evolving into continuous deployment mode, given the monolith legacy of single-unit deployments and regression tests?


Summing up

Microservices are an evolutionary phenomenon, not a ready-to-deploy solution

Microservices will ensure measurable value to your organization only if they are Fit for Purpose – irrespective of whether the project is Greenfield or Brownfield.

Microservices vs. Monolith is not a black-and-white David vs. Goliath scenario; both have distinct value props.

Microservices are naturally attuned to the virtues of heterogeneous data stores and Continuous Deployment

Microservice trade-offs should be guided by respective trade realities, not by the experiences of other organizations

Thursday, September 28, 2017

NoSQL in perspective: Biz above Buzz, Needs above Names


Random notes based on the seminal book "NoSQL Distilled" by Pramod J. Sadalage and Martin Fowler, aimed at enabling faster comprehension



Business needs of the Modern Enterprise

Real-time capture and analysis of big data – coming from multiple sources and formats and spread across multiple locations

Better customer engagements through personalization, content management and 360 degree views in a Smartphone era

Ability and agility in proactively responding to new markets and channels


Constraints of the RDBMS environment

Frequent database design and schema revisions in response to fast-changing data needs have serious application-wide ramifications, as the RDBMS is the point of business integration

Growing data storage needs call for more computing resources but RDBMS ‘scale up’ is prohibitively expensive

Clustering is an effective solution, but cluster-aware relational DBs can’t escape the ‘single point of failure’ trap, since all writes go to a single (however highly available) shared disk.

Sharding in RDBMS puts unsustainable loads on applications



NoSQL in perspective

Over time, enterprises with complex and concurrent data needs created tailored non-relational solutions specific to their respective business environments.

They are a natural fit for the clustering environment and fulfill the two principal needs of the modern enterprise, viz,

Cost-effective data storage ensuring fit-for-purpose resilience and several options for data consistency and distribution

Optimal and efficient database-application interactions


It would be appropriate to name this ever-expanding universe NoSQL, which, contrary to what the name implies, is ‘non-relational’ rather than ‘non-SQL’, since many RDBMS systems come with custom extensions. (NewSQL hybrid databases are likely to open new doors of possibilities.)

Each data model of the NoSQL universe has a value prop that needs to be considered in the light of the given business case, including the required querying type and data access patterns. There’s nothing prescriptive about their adoption. And they are not a replacement for SQL, only smart alternatives.

NoSQL data models

A closer look at two common features:

Concept of Aggregates

Group all related data into ‘aggregates’, or collections of discrete data values (think rows in an RDBMS table)

Operations updating multiple fields within each aggregate are atomic; operations across aggregates generally don’t provide the same level of consistency

In column-oriented models, the unit of aggregation is the column family, so updates to column families for the same row may not be atomic

Graph-oriented models use aggregates differently: writes on a single node or edge are generally atomic, while some graph DBs support ACID transactions across nodes and edges
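A minimal illustration of an aggregate, assuming a document-style store; the order structure and field names are invented for the example.

```python
# Sketch of an aggregate: all data for one order lives in one document,
# so a multi-field update touches exactly one unit of storage.

order_aggregate = {
    "_id": "order-42",
    "customer": {"name": "Asha", "city": "Mumbai"},
    "items": [
        {"sku": "sku-1", "qty": 2, "price": 9.99},
        {"sku": "sku-2", "qty": 1, "price": 4.50},
    ],
    "status": "placed",
}

def update_aggregate(agg, changes):
    # Updating several fields of one aggregate is a single atomic write
    # in most aggregate-oriented stores; cross-aggregate updates are not.
    agg.update(changes)
    return agg

update_aggregate(order_aggregate, {"status": "shipped",
                                   "shipped_to": "Mumbai"})
print(order_aggregate["status"])  # shipped
```

Contrast this with a normalized relational layout, where the same order would span customer, order and line-item tables and require a transaction to update consistently.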


Materialized views

To enable data combination and summarization, NoSQL DBs offer pre-computed and cached queries – their version of RDBMS materialized views – for read-intensive data that can afford to be stale for a while. This can be done in two ways:


Overhead approach
Update materialized views when you update the base data: each entry then updates the history aggregates as well.
Recommended when materialized-view reads are more frequent than writes, and hence views need to be as fresh as possible.

This is best handled at the application end, as it’s easier there to ensure the dual updates – of base data and materialized views.

For updates with incremental map-reduce, it works best to provide the computation to the database, which then executes it based on configured parameters.

Batch approach

Update materialized views in batches at regular intervals, depending on how ‘stale’ your business can afford them to be
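The two refresh strategies can be sketched side by side; the sales records and "totals per product" view are hypothetical, and real stores would persist both rather than hold them in memory.

```python
# Sketch of the two refresh strategies for a "sales by product" view.

base = []   # base data: individual sale records
view = {}   # materialized view: totals per product

def write_overhead(sale):
    """Overhead approach: update base data and the view in one call."""
    base.append(sale)
    view[sale["sku"]] = view.get(sale["sku"], 0) + sale["amount"]

def rebuild_batch():
    """Batch approach: periodically recompute the view from base data."""
    fresh = {}
    for sale in base:
        fresh[sale["sku"]] = fresh.get(sale["sku"], 0) + sale["amount"]
    return fresh

write_overhead({"sku": "sku-1", "amount": 100})
write_overhead({"sku": "sku-1", "amount": 50})
print(view["sku-1"])             # 150 - always fresh, at write-time cost
print(rebuild_batch()["sku-1"])  # 150 - fresh only as of the last batch run
```

The overhead approach pays on every write to keep reads fresh; the batch approach keeps writes cheap and lets the view go stale between runs.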



Domain-specific compromises on consistency to achieve:

a. High availability through Replication: Master-slave & peer-to-peer clusters

b. Scalability of performance through Sharding

In each case, the domain realities matter more than developmental possibilities – what level and form of compromise is acceptable in the given business case helps arrive at a fit-for-purpose solution.

Many NoSQL models offer a blended solution that ensures both high availability and high scalability, where shards are replicated using either master-slave or peer-to-peer methods.

Replication

Master-slave cluster:

Works best for read-intensive operations

Database copies are maintained on each server.

One server is appointed master: all applications send write requests to the master, which updates its local copy. Only the requesting application is notified of the change which, at some point, is broadcast to the slave servers by the master.

At all times, all servers – master or slaves – respond to read requests to ensure high availability. Consistency is compromised, as the system is only ‘eventually consistent’: an application may see an older version of data if the change has not been propagated at the time of the read.

Fail scenarios in Master-slave cluster and their possible mitigation:

Master fails: promote a slave as the new master. On recovery, the original master applies the changes the new master conveys.

Slave fails: read requests can be routed to any operational slave. On recovery, the slave is updated with any needful changes.

Network connecting the master and (one or more) slaves fails: the affected slaves are isolated and live with stale data till connectivity is restored. In the interim, applications accessing the isolated slaves will see outdated versions of data.
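The write path, replication lag and master promotion described above can be sketched as a toy cluster; all names are invented, and real systems replicate asynchronously over the network rather than via an explicit `replicate()` call.

```python
# Toy master-slave cluster: writes go to the master, reads to any server,
# and replication is deliberately delayed to show eventual consistency.

class Server:
    def __init__(self, name):
        self.name = name
        self.data = {}

class Cluster:
    def __init__(self, names):
        self.servers = [Server(n) for n in names]
        self.master = self.servers[0]
        self.pending = []   # changes not yet broadcast to slaves

    def write(self, key, value):
        self.master.data[key] = value       # master updates its local copy
        self.pending.append((key, value))   # broadcast happens later

    def replicate(self):
        for key, value in self.pending:
            for s in self.servers:
                s.data[key] = value
        self.pending = []

    def fail_master(self):
        # Promote the first surviving slave to master.
        self.servers.remove(self.master)
        self.master = self.servers[0]

c = Cluster(["m", "s1", "s2"])
c.write("price", 10)
stale = c.servers[1].data.get("price")   # None: slave not yet updated
c.replicate()
c.fail_master()                          # s1 takes over, with the data
print(c.master.data["price"])            # 10
```

The `stale` read is exactly the eventual-consistency window: a slave answers reads before the master's broadcast reaches it.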

Peer-to-peer cluster:

Works best for write-intensive operations.

All servers support read and write operations.

A write request can be made to any peer, which saves the changes locally and confirms them to the requesting application. Other peers are subsequently updated.

This approach spreads the load evenly, but if two applications change the same data simultaneously at different servers, conflicts occur that have to be resolved through quorums. If there’s only a thin chance of two applications updating the same data at almost the same time, a quorum rule can state that data values be returned as long as two servers in the cluster agree on them.

Sharding

Evenly partition data across separate databases and store each database on a separate server. If and when the workload increases, add more servers and repartition the data across them.

To make the most of sharding, data accessed together is ideally kept in the same shard. It’s hence recommended to proactively define aggregates and their relationships in a manner that enables effective sharding.

In the case of global enterprises with widely-dispersed user locations, the choice of sites for hosting shards should be based on user proximity as well as the most-accessed data. Here again, aggregates should be designed in a manner that supports such geography-led partitioning.

Sharding largely comes in two flavors:

Non-sharing shards, which function like autonomous databases, with sharding logic implemented at the application end.
Auto shards, where sharding logic is implemented at the database end.

Sharding doesn’t work well for graph-oriented data models: the intricately connected nodes and edges make partitioning a huge challenge.
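A sketch of hash-based shard routing, closer to the application-end flavour; the key scheme and three-shard layout are illustrative.

```python
# Hash-based shard routing: a stable hash of the key picks the shard,
# so the same key always lands on the same server.

import hashlib

SHARDS = [{}, {}, {}]   # three shard stores, one per server

def shard_for(key):
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % len(SHARDS)

def put(key, value):
    SHARDS[shard_for(key)][key] = value

def get(key):
    return SHARDS[shard_for(key)].get(key)

put("user:1", {"name": "Asha"})
put("user:2", {"name": "Ravi"})
print(get("user:1")["name"])  # Asha
```

Note the repartitioning cost hinted at above: adding a fourth shard changes `% len(SHARDS)` for most keys, which is why production systems tend to use consistent hashing or range-based schemes instead of a bare modulus.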

Ways to improve ‘eventual consistency’: Quorums and Versioning

Read and Write Quorums

Quorums help consistency by establishing read and write quorums amongst the servers in a cluster. For reads, the data values agreed by the read quorum are returned; for writes, the change is approved by a write quorum of servers in the cluster.

Applications read and write data with no knowledge of quorum arrangements which happen in the background.

The number of servers in a quorum – read or write – has a direct bearing on database performance and application latency: the more servers, the longer read and write quorum approvals take.
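The sizing trade-off above follows from simple overlap arithmetic: with N replicas, a read quorum R and a write quorum W are guaranteed to intersect (and so return the latest write) whenever R + W > N. A sketch:

```python
# Quorum arithmetic: a read quorum R and write quorum W over N replicas
# must share at least one server whenever R + W > N, which is what
# guarantees the read sees the most recent approved write.

def quorums_consistent(n, r, w):
    return r + w > n

# N = 3 replicas: R = W = 2 overlaps; R = W = 1 does not.
print(quorums_consistent(3, 2, 2))  # True
print(quorums_consistent(3, 1, 1))  # False
```

Shrinking R or W below the overlap threshold buys latency at the cost of possibly stale reads, which is exactly the consistency compromise the section describes.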


Version Stamps

Consistency problems can arise in relational and non-relational databases alike, despite ACID guarantees or quorum rules. A case in point is a lost update from concurrent access to the same data, where one modification overwrites the changes made by another. In business cases that can’t afford pessimistic locking, version stamps are a way out:

An application reading a data item also retrieves its version information. While updating, it re-reads the version info; if it’s unchanged, it saves the modified data to the database with a new version stamp. If not, it retrieves the latest value – probably changed by another application – and re-reads the version stamp before modifying the data.

In the time between re-reading the version info and changing the values, an update can still be lost to a change made by another application. To prevent this, data can be locked for that time frame, in the hope that it will be minuscule.
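The read–check–write cycle can be sketched as a compare-and-set on a version stamp; the in-memory store and field names are invented for the example.

```python
# Optimistic update with a version stamp: the write succeeds only if the
# version read earlier is still current (compare-and-set).

store = {"balance": {"value": 100, "version": 1}}

def read(key):
    item = store[key]
    return item["value"], item["version"]

def write(key, new_value, expected_version):
    item = store[key]
    if item["version"] != expected_version:
        return False                      # someone else changed it: retry
    item["value"] = new_value
    item["version"] += 1
    return True

value, version = read("balance")
print(write("balance", 150, 99))          # False: stale version, rejected
print(write("balance", value + 50, version))  # True: fresh version, accepted
print(store["balance"])  # {'value': 150, 'version': 2}
```

A rejected write signals the application to re-read and retry, which is how the lost-update overwrite is avoided without a pessimistic lock.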

A few NoSQL models, like column-oriented DBs, enable storing multiple versions of the same data in an aggregate along with a version timestamp. As and when needed, an application can consult the history to determine the latest modification.

When synchronicity between servers in a cluster is in question due to network constraints, vector clocks are seen as a way out. Each server in the cluster maintains a count of the updates it has applied, which other servers can refer to, thereby avoiding conflicts.
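A minimal vector-clock sketch, assuming each server increments only its own counter; comparing the vectors tells whether one version descends from another or the two are concurrent and need conflict resolution.

```python
# Vector clocks: each server counts its own updates; the component-wise
# comparison detects whether versions are ordered or concurrent.

def increment(clock, server):
    clock = dict(clock)
    clock[server] = clock.get(server, 0) + 1
    return clock

def descends(a, b):
    """True if version a has seen everything version b has."""
    return all(a.get(s, 0) >= n for s, n in b.items())

v1 = increment({}, "A")          # {'A': 1}
v2 = increment(v1, "A")          # {'A': 2} - descends from v1
v3 = increment(v1, "B")          # {'A': 1, 'B': 1} - concurrent with v2
print(descends(v2, v1))                      # True: safe to overwrite v1
print(descends(v2, v3) or descends(v3, v2))  # False: conflict to resolve
```

When neither version descends from the other, the database has detected a genuine concurrent update and must fall back on a resolution strategy such as quorum agreement or application-level merging.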

What does ‘schema-less’ actually mean?

The onus shifts to the application side

In NoSQL databases, data structures and aggregates are created by applications. If an application is unable to parse data from the database, a schema mismatch is certain – only that it would be encountered at the application end.

So, contrary to popular perception, the schema of the data still needs to be considered when refactoring applications.

That applications have the freedom to modify data structures does not remove the need for a disciplined approach. Undisciplined changes in structure invite undesirable situations: they can complicate data-access logic and even leave a lot of non-uniform data in the database.

Wednesday, September 27, 2017

Big Data, Bigger Names

Sudhir Raikar , IIFL | Mumbai | March 08, 2016 09:15 IST

Even as companies big and small are busy declaring their Big Data initiatives ahead of driving them, IIFL pays tribute to a few nonconformist champions – some central, others tangential – who together lend meaning and substance to the over-chewed buzzword, thanks to their intuition, insights and inquisitiveness.




courtesy: http://www.indiainfoline.com/article/editorial-perspectives-technology/big-data-bigger-names-116030800132_1.html

In a world of unabashed corporate antagonism, replete with umpteen “founding” and first-mover claims to breakthrough ideas, concepts or methodologies, certain mavericks stand out for their quiet authority.

Like computer scientist John Mashey, founder of the ASSIST assembler language teaching software and author of the PWB Unix shell, or "Mashey Shell". He’s arguably the father of the term Big Data, having christened it in 1994 in a remarkably matter-of-fact fashion while he was chief scientist with Silicon Graphics, then a hot and happening Valley player working on Hollywood special effects and spy surveillance systems and hence playing with a lot of data.

Devoid of any academic attribution save for numerous technical talks, thankfully available on websites devoted to technical research, Mashey has only his unflinching conviction to fall back on. He doesn’t need to simply because he’s not staking any claim. Instead, he selflessly right sizes the imagination of people keen to confer the founding title on him, humbly summarising the coinage as only an attempt to settle on an all-inclusive phrase to convey the explosive growth and advancement in computing. This hiking, biking, skiing enthusiast is too busy with his intellectual and creative pursuits to seek reverence for his prescience. This introduction slide from one of his technical presentations (http://www.slideshare.net/amhey/big-data-yesterday-today-and-tomorrow-by-john-mashey-techviser) is a good window into his talent and temperament.




Like Gartner data analyst Douglas Laney, who first recalled Mashey’s name - in the context of big data - through a media correspondence. Douglas is the author of the 2001 pioneering research note 3-D Data Management: Controlling Data Volume, Velocity and Variety and among the earliest to discern that more than growing volumes, it was the data flow speeds, thanks to the collective handiwork of e-commerce and post-Y2k ERP application boom that posed a real challenge to data management teams worldwide. As expected, several vultures from the unabashedly ambitious market place claimed Laney’s research as their own, peddling muddled replications and variations of his 3-V (Volume, Velocity and Variety) framework. Laney’s retort befits his nonconformist nature. He’s posted the contents of his original paper (sadly no longer available in Gartner archives) “for anyone to reference and attribute”. Here it is: http://blogs.gartner.com/doug-laney/deja-vvvue-others-claiming-gartners-volume-velocity-variety-construct-for-big-data/

Like etymologist, editor and Yale researcher Fred Shapiro who traces the origin, development and spread of words as a means to study intellectual evolution, not for academic posterity.

Like University of Pennsylvania economist Francis X. Diebold, who initially claimed to have coined the term in his paper “Big Data Dynamic Factor Models for Macroeconomic Measurement and Forecasting,” but later wrote another research paper to humbly reverse the claim, circuitously acknowledging Mashey’s contribution. To quote him, “The term “Big Data,” which spans computer science and statistics/econometrics, probably originated in lunch-table conversations at Silicon Graphics Inc. (SGI) in the mid 1990s, in which John Mashey figured prominently.”

And last but not the least, like award-winning journalist Steve Lohr, author of the definitive software chronicle “Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts — The Programmers Who Created the Software Revolution” and “Data-ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else.”

Mashey’s deep connect with Big Data came to light through Lohr’s perceptive 2012 search for the term’s origins in loads and loads of digital archives. It was at Lohr’s behest that Shapiro dug out several digital references to trace the origin of Big Data. When he could not come up with anything conclusive, Lohr approached people with knowledge of the subject matter, and Diebold and Laney were among the many people to respond.

Unfazed by the inconclusive results of his hunt, Lohr kept it going, looking for the two words, not merely used as a pair, but used in a manner that would connote the essence as we know it today: massive volumes of structured and unstructured data that move too fast and call for new ways of management. Such usage, Lohr believed, could only be steered by someone with a computing context. Precisely why he zeroed in on Mashey, not on other intriguing but out-of-context references like these two lines from bestseller author Erik Larson’s Harper’s Magazine piece on mailbox junk spread by the direct-marketing industry: “The keepers of big data say they do it for the consumer’s benefit. But data have a way of being used for purposes other than originally intended.”

Hats off to Lohr for his inquisitive and informed search for the name of a phenomenon that’s now a household name across spheres. Companies flaunting their smallest of Big Data initiatives would do well to learn from Mashey’s prolific nonchalance and Laney’s altruistic activism. Armed with the duo’s frame of mind, they would be in a better position to lock horns with the multihued Big Data challenges including curation, updation and integration. Read all about Lohr’s account in this dated but delightful piece: http://bits.blogs.nytimes.com/2013/02/01/the-origins-of-big-data-an-etymological-detective-story/?_r=0

“To me, bancassurance is more about going deep rather than going wide”

courtesy: http://www.indiainfoline.com/article/editorial-interviews-leader-speak/vighnesh-shahane-ceo-wholetime-director-idbi-federal-life-insurance-117092600279_1.html



A former cricketer known for his Jeff Thompsonesque sling‐arm bowling action, the 48‐year‐old CEO of IDBI Federal Life Insurance Vighnesh Shahane seems to have maintained a good line and length in managing the affairs of a joint venture ‐ between IDBI Bank, Federal Bank and Belgian insurance major Ageas, three entities with diametrically diverse legacies bound by an intrinsically common goal ‐ which is also believed to be contemplating a stake sale. Shahane’s clarity of conviction is evident in the way he pinpoints key challenges and opportunities of the insurance space, and outlines his corporate plans and priorities in this interaction with Sudhir Raikar. Edited excerpts...

Do you feel your unflinching bancassurance conviction has kept you in good stead, at a time when most insurers have gone the ULIP way?

I will answer this question in two parts. On the Bancassurance channel, established infrastructure and loyal customer base obviously make it a preferred choice of life insurance companies. In our case, being a joint venture of two prominent banks ‐ IDBI Bank and Federal Bank ‐ and Ageas, we had ample room for growth amid high competition and our performance bears testimony. We were among the few life insurance companies to break even in a mere five years of operations. Our gross written premium (GWP) has almost doubled in the last three years. In fact, for the quarter ending June FY18, we recorded a 65 per cent year‐on‐year increase in new business premium (NBP) in the individual insurance segment.

Talking about ULIPs, we truly believe that it’s a great product in its current avatar. However, customers still lack a good understanding of the ULIP nitty-gritty. There have been instances when they came back with grievances. We don’t intend to chase ULIPs at the cost of jeopardizing our bancassurance relationship and hence, we prefer to sell them to discerning customers who understand the product well and are not swayed by transient market volatility. During Q1 FY18, ULIPs comprised 17% of our total New Business Premium.

Does the cocoon of high persistence, low-cost banca channel unknowingly restrict the scope for growing the digital and agency channels?

High persistency and Banca channel are our strategic growth drivers, not really a cocoon limiting our creative thinking. We continue to leverage other growth channels. We have a growing digital channel and an agency channel, both of which we are nurturing in a calibrated and profitable manner. Besides, we also have other channels like Group, Broking and NR, and we were one of the earliest entrants into the newly introduced POS channel. We are also on the lookout for new bancassurance tie ups to strengthen our distribution.

Do you expect the online route to fetch better outcomes in the time to come, more so if customers demand simple, transparent and fast interactions?

Life insurance is essentially a push product, requiring a face‐to‐face interaction. Products require long‐term commitment and hence assistance from some expert acts like an assurance for the buyer. I don’t see human intervention and dependency reducing immediately, unless AI or any other technological innovation suddenly transforms the purchase experience. Having said that, there’s great potential for simpler and pure term products in the online space. This is because of the increasing awareness about the need to safeguard the financial future of one’s loved ones as also the low‐cost awareness surrounding online products. I strongly feel there’s immense scope for product innovation through digital.

How is IDBI Federal Insurance placed on the tech innovation front?

For us technology is driven by purpose. Being a medium‐sized company, it is pertinent we drive tech innovation only after thorough evaluation. To avoid needless rollout delays, we have a dedicated team focused on ushering in new ideas and innovation. However, every implementation is subject to scrupulous evaluation.

This year we have launched two new IT initiatives. The first is our mobility platform whereby we have developed a tablet application to enable our sales teams in selling our products on the digital platform. We have named this “On the Go” keeping in mind the real benefit it brings to the salespeople and their customers. We have conducted a pilot of this tablet‐based sales model and the results are very encouraging. Interestingly, our late entry into this platform helped us learn from others which in turn helped us launch it in record time. Where other companies have spent years in solution development, we launched the pilot in just 7 months. We shall go live with 100 users in Banca and gradually cover the entire sales force.

The second initiative is the new IDBI Federal website. Our new website is built for the digital consumer and addresses all kinds of visitors – explorers, buyers and on‐boarded customers. The new front end (that the customers see) is supported by a completely new backend. Not only is the look and feel new‐age, the engine driving it is also state‐of‐the‐art technology, developed keeping the future in mind. Besides these, we have also implemented a Workflow Management System which has reduced our turnaround time for issuance by 75%.

What are your thoughts on the fate of open architecture in India?

Open architecture has not really taken off in India. To me, bancassurance is more about getting deep rather than going wide. Banks have their own products too; life insurance is not their core offering. So, it requires a lot of time and effort to egg them on to sell life insurance products. It is essential for a bank to tie up with an insurance company with which it has strategic and cultural alignment, and build on the relationship thereafter.

What’s the biggest distribution-related challenge in your reckoning?

The biggest distribution challenge for IDBI Federal is how to enhance the productivity of the existing distribution network. As I mentioned earlier, though bancassurance is the largest contributor to our business, we are still scratching the surface. To realise the full potential of this channel, as also of other channels, it’s imperative to enhance their productivity.

You seem to have steered clear of the Health & Pension space.

There is no plan as of now. Medical costs are increasing and there are good opportunities. But it’s not our playing field as of now. Likewise, we don’t have immediate plans to go deeper in rural areas.

What was the underlying thought behind your claims guarantee scheme? How has it fared?

One of the biggest challenges that the industry faces is how to woo the customer back. For the customer, the most critical part of their relationship with their insurer is the point of making a claim. Through our claims guarantee scheme, we aimed to settle claims in just eight working days. In case we fail, we pay interest of eight per cent per annum on the death claim amount for each day of delay beyond eight working days. We have not had to pay a single penny as interest since the launch of this initiative in 2014. As for its positive impact on the brand, IDBI Federal was declared one of the top ten most trusted life insurance companies in India as per the Economic Times Brand Equity Survey.

How do you see life insurance products evolving over time? How disruptive would the waves of Big data, Machine learning, IoT and value-added analytics prove for the sector?

Digital is not just the purview of the handful of people in the digital team. It is a culture that needs to be spread across the organisation. We may introduce many initiatives in the ‘digital’ space but very few would succeed unless we have digital embedded within the organisation’s culture.

Digital and technology are double-edged swords. With every step in the positive direction, there is a potential downside that you need to shield yourself against. In the case of IT, the downside comes in the form of cyber risk and the threat to data security. One, therefore, must tread the digital path with care and after thorough evaluation. It’s not an ‘either-or’ business case; a careful balance needs to be maintained.

What’s also important in the race to get more tech oriented and digitally savvy is the relentless focus on the consumer. Else, a business may get overawed by technology and could adopt it without a strong consumer benefit attached to it. We are wary of this and always keep the consumer filter on while evaluating tech.

Where do you see IDBI Federal in three years from now, vis-a-vis competition from public and private players?

Nobody can predict what the future holds, even with ambitious growth plans for the company. Our gross written premium (GWP) has almost doubled in the last three years. In 2013-14 our Total Premium was Rs. 826 crore, and in 2016-17 we closed the year at Rs. 1,565 crore. Two years ago, this seemed impossible, but we did it. Our basic goal is to keep performing better than the industry average.

Do you expect steady growth both in NBPs and Renewals going forward or would it be skewed in favour of one of these?

NBP grows faster than renewals in early years but as a company gets tenured, the renewal growth becomes significant. Our persistency across buckets is one of our strengths. Our surrender ratio is also one of the lowest. We have an equal focus on new business and the business staying in our books.

What are your views on the possibility of consolidation in your sector in the coming time? What's your take on the inorganic route to growth?

Consolidation and listing are the new normal for the insurance industry. Speaking of inorganic growth, we have no plans as of now. Our immediate priority is to fortify Banca, re‐energize Agency as well as incubate new channels. We are unflinchingly focused on profitable, all‐round growth and value addition for our shareholders, customers and employees.

Friday, September 22, 2017

Dear Pradyuman


We know you now keep vigil from up above. We know you have had to take matters in your own hands, going by the sorry state of affairs down below. Things seem to be falling in place only because you are now in charge of your own case. With the CBI now in the picture, we hope justice is on its way. But the larger truth continues to haunt us.

As a nation of rogues across spheres - polity, education, medicine, health care, judiciary, police, business, art, leisure, media, sport, science, technology, religion and spirituality included - and of helpless bystanders like me who can do nothing more than offer condolences, we have collectively failed you. All our progress since independence - strides in outer space, laughable IT-superpower claims, tall talk of rich heritage & culture - has come to naught. Please forgive us. And please forgive our talk show specialists (especially school principals, educationists, columnists and tinsel town celebrities) for their politically correct media bytes, dramatic posturing and even pseudo poetry.

We were introduced to you only after your demise but it didn't take long to realize that you are very special, just like your name Pradyumna, arguably the only three-lettered Sanskrit word with all letters conjunct (जोडाक्षर). Please give us the strength to come to terms with the fact that you are no longer with us.


Dear Bloomberg Businessweek

On September 8, India woke up to one of its worst tragedies in recent times when seven-year-old Pradyuman Thakur was found brutally murdered in the toilet of his school, Ryan International, in Gurugram. I feel Bloomberg Businessweek should do an in-depth story on this national disaster, as also on the money-minting business of education in India. No wonder several fraudsters thrive on their nation-wide school and college chains, largely helped by scheming minds and political clout, as also by unsuspecting (read unmindful) parents who take the 'international' tag at way more than face value.

Your Indian coverage is extremely sketchy anyway (compared to your China bytes) and some of the reports on India's IT challenges in the Trump era and demonetization were pretty mediocre. The best Bloomberg piece on India was Ben Crair's report titled "Maniac Killers of the Bangalore IT Department." It would be great if Ben covers the Ryan episode as well.

Little Pradyuman awaits justice, even if posthumously. Falling standards of journalism, particularly in India, have made us highly cynical about our expectations from the media. Bloomberg is one sweet exception. Your reportage goes way beyond business matters and seems to trigger actionable insights, unlike many Indian publications and even a few reputed global names. A Bloomberg story could go a long way in making the world aware of Pradyuman's tragedy, which is, and should be, ours in the same breath.


Regards

Friday, September 08, 2017

Fed onward, Feb onwards


Courtesy: My thought piece published in IIFL Wealth Market Xpress

With Fed Chair Janet Yellen’s four-year term formally ending in Feb 2018, speculation is rife over her prospective successor. The bigger question, however, is whether Trump’s ultimate choice would help the Fed reinforce its credibility as the world’s premier central bank.

Sudhir Raikar

Ever since President Trump named Gary Cohn, his chief economic advisor and former Goldman Sachs luminary, as one of the likely candidates for the Fed top post, observers and experts have been listing out the pros and cons of having Cohn at the helm of the country’s central banking system. Though Trump’s shortlist comprises a few other contenders, including Yellen for a possible reappointment given his new-found respect for being a ‘historically low-interest person’, it is Cohn’s name that is in furious circulation. In striking contrast, recent reports that he’s leaving the White House, miffed with Trump over the press conference following the Charlottesville violence, are also doing the rounds. The White House has rubbished these reports as baseless and untrue, stating that Cohn remains staunchly focused on his job as the NEC director.

Whether Cohn stays at the White House or not, and whether he takes charge of the Fed or not, his life story reads like a silver-screen blockbuster plot. A dyslexic of humble beginnings, he sold window frames and aluminium sidings on weekdays while nursing an undying passion for financial markets on weekends. A chance conversation with a commodity exchange trader during a cab ride to the airport opened up the prospect of making it big in the uncharted territory of options trading. Cohn knew nothing about options till that point but cleared the litmus test in convincing fashion, courtesy Lawrence G. McMillan's "Options as a Strategic Investment".

He never looked back. Thanks to his razor-sharp intellect and never-say-die spirit, he went from strength to strength in strikingly offbeat fashion and rose to become president and COO of the legendary Goldman Sachs, sans the default passport of an Ivy League background. If he does make it to the Fed, it would be without the customary Ph.D. in economics. In many ways, Cohn would be an alien at the Fed office, much like Trump is at the White House, as both bring their distinct brands of corporate bluntness to the governance table. Opinion is largely divided on his Fed appositeness though.

His backers find him tailor-made for the role: a compulsive doer who would usher the much-needed trading-floor aggression into the Fed’s conference room, known for its stoic, consensus-driven contemplation steeped in economic conjecture. He has already left an indelible mark at the White House. Entrusted with the task of reforming the US tax code and reimagining US infrastructure priorities, he’s known to speak his mind, and his fearless candour is diametrically opposed to the measured diplomacy of the White House. His fierce pro-global stand against trade restrictions, and even his failed attempt to dissuade Trump from withdrawing from the Paris Agreement, bear ample testimony. It seems highly unlikely that he would become Trump’s man, or even a ‘Goldman’, in deciding the course of the nation's economy as the Fed chief.

And then there are others who find him wholly unqualified for the emblematic Fed cause. They argue he would push a weak currency as a silver-bullet stimulant for economic growth and necessarily look at every policy from the slender viewpoint of financial markets, which would likely make the supposedly apolitical Fed an unmindful bailout machine all over again. Cohn, in his Goldman Sachs avatar, had condemned the Fed for confusing the markets by injecting liquidity into the system on one hand and advising banks to be wary of lending on the other. Having said that, what he would do as the Fed Chief is another story.

The nation-wide deliberation on the new Fed chief – going by the coveted opinion polls and surveys – seems knowingly or unknowingly centred on politics and personalities. It should instead be focused on likely market outcomes: what would the differing perspectives of the different probables mean for the Fed? Would they help make the institution more relevant to the US (and the world) or would they leave it even more obscure over time?

For long, the Fed – like most central banks across the globe – has upheld rather theoretical notions about the largely imagined ‘desired’ economic effects of low interest rates, quantitative easing programmes and bond roll-offs. If financial markets have been stable for quite some time now, no one can tell with certainty how much of it can be attributed to the Fed’s reading of inflation and full employment. And this is not to take away anything from the Fed’s regulatory and supervisory credibility. Going forward, the Fed’s challenges would only get more complex, especially in the light of growing geo-political tensions across the globe, as also the looming fear of economic downturns, purely going by the law of averages. Sound advice on what the Fed should be doing is abundant even within the Fed community. Ask Neel Kashkari, President of the Federal Reserve Bank of Minneapolis, about the virtues of waiting for discernible signs of wage and price pressures before opting for rate hikes, or his predecessor Narayana Kocherlakota, who firmly believes monetary policymakers need to rely less on set rules and more on discretion to ensure better outcomes.

Given the Fed’s hardcoded fixation with unemployment rates, it will take someone persuasively pragmatic to steer monetary policy in a way that doesn’t overreact to fears of rising inflation or the risk of evolving asset bubbles. Whether that someone is Cohn or Yellen, or for that matter former Fed governor Kevin Warsh, Columbia professor Glenn Hubbard or Stanford academic John Taylor, why and how should it matter?



Thursday, September 07, 2017

Interesting memoir, Moving Au Revoir


Sudhir Raikar, IIFL | Mumbai | August 11, 2016 16:43 IST
Courtesy: http://www.indiainfoline.com/article/general-life-style-book-review/interesting-memoir-moving-au-revoir-116081100752_1.html

With his no-holds-barred memoir, former RBI Governor Dr. Duvvuri Subbarao has inadvertently unleashed a new literary genre: a thriller of a primer. Sudhir Raikar takes a closer look at the style and substance of his enduring work, which ex-ICICI Bank chief KV Kamath aptly calls ‘unputdownable’.




The media is abuzz citing, interpreting and analyzing the seemingly contentious issues of 'Who Moved My Interest Rate?' while conveniently ignoring the delightfully enduring aspects of the incisive memoir. Hardly a surprise, given the typical fourth-estate obsession with theatrical story-telling, thriving on sensationalism that, more often than not, is bereft of sense.

But the undeniable fact is that Dr. Subbarao’s tell-all book is an endearing primer for posterity, one that unfolds a rainbow of his emotions – fear, anxiety, hope, surprise, shock, delight, contentment, lament and predicament – while at the helm of the Reserve Bank of India (RBI) in what was a terrifyingly tumultuous tenure. He provides the context to each conflict he faced, which in turn tells us more about the man’s stoic character and his resolute mission – beginning with the 2008 global meltdown and his struggle to shield India from the unforeseen tremors of an intertwined financial world (one that gave little say to emerging market economies) and ending with the nasty rupee fall of 2013, which raised lethal questions about the lack of forex build-up in the relatively happy years. Interspersed in between was the chronic fight against the government’s hardcoded stance in the age-old growth vs. price stability debate, where Dr. Subbarao was implicitly expected to toe the line, more so given his long, eventful stint on the ‘other side’.

Dr. Subbarao’s sincere account, among other things, brings to light the pathos of the Governor’s job, where acknowledgment of short-term pay-offs is ephemeral while the unforgiving evaluation of long-term consequences, in the light of hindsight, seems eternal.

The innovation that Dr. Subbarao has steered towards making the RBI federal in thought and action – free of hierarchies and confirmation biases – is a case study for both public and private sector players. Wish we had many more CEOs with Dr. Subbarao’s vision and values. The quality of corporate sector governance is as big a problem as are cyclical ups and downs and external shocks. And there's a lot to learn from the prudence and precision of Subbarao’s outreach programs towards making our social development initiatives fit for purpose.

Dr. Subbarao’s effort to free each concept of its intimidating jargon - more so for the novice reader - is evident across all chapters, which makes this book a treasure trove for students of economics and finance in particular. Every concept is explained in simple language with a discerning desire to reach out to common people who don’t have the time, inclination or intellect to decipher inflation numbers and interest rates – whether it is the measure of financial integration, the double-edged sword of globalization, the supply-side triggers of inflation, the perils of working with flawed data, the myth that inflation targeting is opposed to growth, RBI’s unflinching transparency on the government’s fiscal stance, the typical central bank quandary of striking a balance between the demands of the privileged, vocal industry fraternity and the mute voice of the common people yearning for lower prices, the intricacies of monetary policy transmission and the ensuing liquidity management through instruments like Open Market Operations, RBI’s lesser-known roles and responsibilities including its social development agenda, the domestic issues behind the currency slide beyond the taper-tantrum trigger, the challenge of exchange rate management and the attached ‘do something vs. do nothing’ dilemma, the tenets of financial inclusion, or the modus operandi of NBFCs.

Even readers who feel they have little to do with economics and finance will find the account absorbing, one that evokes powerful imagery of things you don’t expect from a mainstream book by a bureaucrat – capturing vivid details of the structure and layout of the Governor’s office and the timeless charm of his British-brand BHNS-maintained residential bungalow with ‘its wafting fresh air, cacophony of birds and heavy bunches of jackfruit’. Don’t miss his incisive interpretation of Jean-Paul Sartre’s existentialist observation ‘Man is condemned to be free’ in the context of the question on what he would have done differently as the governor of the Reserve Bank.

In wonderful contrast to his profundity on philosophical issues is the ex-governor’s passion for adventure and recreation, evident from the umpteen references throughout the 300-odd pages – whether his interest in 24-carat Bollywood products like Chennai Express and offbeat Prithvi Theatre plays, his craving for Tardeo’s Maharashtrian eateries, or his penchant for inquisitive history tours around South Mumbai in ardent appreciation of the city’s emblematic blend of diverse architectural styles. And his wit, a recurring highlight of his insightful account, is superlative to say the least. Of course, it’s better read than cited.

The engaging anecdotes from his pan-India journeys for outreach programs and financial inclusion initiatives are truly inspiring. Wish our branded activists would take a cue from Dr. Subbarao’s freewheeling, ventilated approach to financial literacy. Thanks to the arid, bureaucratic mechanisms of conventional NGO bodies, proletariat activists and CSR practitioners across the globe, social responsibility, knowingly and unknowingly, has come to harbour several blatant assumptions about the larger cause of end-beneficiaries (often generically slotted as ‘target groups’ or ‘deprived’ communities). Conveniently overlooked in the process is the plain fact that their deprivation is only circumstantial and in no way indicative of the instinctive and intellectual capacities inherent in the community. Contrary to popular perception, the supply-side forces, in the mad rush to emancipate the downtrodden, are themselves found deprived when it comes to even reading the minds of the audience, leave alone identifying its needs. In peddling their jargon-heavy, black-and-white prescriptions on financial prudence and general well-being, they are knowingly and unknowingly oblivious to the expressions of playful amusement and suppressed yawns that the so-called ‘deprived’ reserve for the seemingly ‘privileged’ – stemming more from doubt than disbelief.

As for those from the journalistic tribe who wished Dr. Subbarao had been more alpha male during his tenure, he has shown the virtues of a public beta release in the form of his book, which now allows anyone to download his thought process. How many governors would attempt such introspection for open dissection, one that also talks of what he felt he could have done differently – like the need to adjust the stated policy on foreign exchange and make it more specific with respect to defining and managing volatility and building self-insurance?

Dr. Subbarao’s submission of the ‘twinges of guilt at the thought of millions of Mumbai slum dwellers under leaky roofs for whom the rain meant the loss of daily earning, and hungry children’ is particularly moving, given that he candidly shares his helplessness rather than choosing to overlook the obvious in line with the implicit demands of his official stature, for such predicaments, protocol says, are deemed too poetic for certified comfort.

Given Dr. Subbarao’s conviction in sharing his no-holds-barred account, the real tribute to his effort won’t merely be the book’s critical acclaim, but some quality introspection by the powers-that-be as to how the Government-Central Bank relationship could be made more cohesive and solution-centric without diluting the sovereign fabric of the latter that we are all proud of.

Central bankers across the globe, we have seen time and again, are compelled to facilitate government access to near-free debt under the guise of fiscal spends. This ends up building a toxic cocoon for government debt issuance in the name of achieving growth targets. The haphazard lending that follows eventually leads to systemic chaos in the form of rampant bank failures and consequent turmoil in bond and equity markets.

A healthy synergy between the Central Bank and the government should lead to more credible and sustainable solutions to the various problems that stand in the way of India’s economic development. Prime among them is the NPA issue. It’s common knowledge that dealing with doubtful and distressed assets has always been the Achilles’ heel of the banking sector. Barring a few players known for their stringent lending norms, most bankers try to downplay the whole issue through the usual philosophical sermon: that NPAs are an integral part of banking, given the criticality of broad-based operations to profitability, which exposes banks to all kinds of unavoidable factors like economic downturns and political upheavals. So, they claim, even the safest of loans can be rendered unproductive…

It is high time we stopped hiding behind these lame excuses and collectively addressed some tough questions that make NPAs more elusive than meets the eye. There’s no doubt that more RBI enactments will follow in the time to come. But unless we turn our attention to the fundamental questions surrounding NPAs, every RBI intervention will seem more palliative than curative.

Hope the new wave ushers in an environment of proactive prudence that penalizes banks and auditors for the suspect motives that serve as a greenhouse for NPAs. This way, banks, ARCs and their regulators would be left to deal only with genuine cases of NPAs. For the Indian banking sector, that would be a big leap forward.

It’s high time we also demystified the glorious economic abbreviations that fuel a debate among practising economists and fiscal experts and yet mean little or nothing to the common man. Rather than board ceremonial flights of imagined realities consequent to the published data, our experts would do well to demand a governmental initiative to simplify the data for better public comprehension. Needless to say, caring for this precision and validation is the collective responsibility of the government and the private sector.

As Dr. Subbarao astutely reflected during his inaugural address at the July 2011 Statistics Day Conference: “The decisions that we in the Reserve Bank make have a profound impact on the macro economy, and errors can be costly. Our policy judgement should therefore be based not only on state of the art skills in data analysis and interpretation but also on an intellectual value system of ruthlessly honest validation and peer review.”