Brett King


An organization structure that doesn’t match customer behavior

In Retail Banking, Strategy on October 26, 2009 at 13:11

Excerpt from Chapter 2 – Measuring the Customer Experience

When you examine the behaviour of customers, the glaring realization is that institutions essentially assume that customers only ever use one channel at a time to interact with them. Hence, it is not unusual to find a web team that believes it can take 30-40% of branch traffic and service it online. Likewise, it is not unusual to hear proponents of branch banking telling us “the branch is back” and that the winning strategy is to invest in more real estate and variations of the branch to retain customers. It’s also not unusual for customers to receive dozens of direct mail offers, email marketing offers or SMS promotions from different ‘revenue centres’ within the bank, each independent of the others.

By 2008, 90-95% of daily transactions were done electronically, and in most cases the majority of transaction volume came through direct channels, namely ATM, Call Centre and Internet. In February 2007, HSBC in Hong Kong reported in the South China Morning Post that 90% of their daily transactions were through phone, Internet or ATM, leaving the rest to the branch. Rabobank, First Direct, ING Direct and others have been able to operate successfully without any reliance on branch structures. This is not a criticism of branches, because we believe that branches will remain an essential part of the future of banking. However, look at the organization structure of most banks today and you’ll see a complete and total lack of understanding of customer behaviour inherent in the organization chart. It’s really quite appalling that the organization structures of many banks have not caught up with this reality.

When you examine the organization structure of most retail banks, the Head of Branch Networks is second only to the Head of Retail, and in many cases is a direct report to the CEO. In comparison, the manager responsible for the Internet channel often sits within the IT or Marketing department, three or four levels below the organizational equivalent of the branch business unit head. So let’s get this straight: 90% of the transactions go through channels managed by people with only a modicum of influence within the organization structure, while the Head of Branches has the ear of the CEO and looks after just 5-10% of the daily traffic within the bank.

Figure - Partial Retail Banking Org Chart as it relates to channel priorities

“Ah, but the branch generates all the revenue…” we’ve heard it argued, as the justification for keeping traditional structures in place. Well, let us examine whether that is really the case.

Let us take credit card acquisitions as an example. How do we market credit cards? Currently we might use direct mail, newspaper advertisements, the web and possibly promotional marketing offering a ‘free gift’ if clients sign up for a new Visa card or MasterCard. Customers are then faced with perhaps two or three choices of how to apply. The first option is to call the call centre, but the call centre refers them to the branch because they need to present proof of income and proof of identity to an officer of the bank. The same might be the case for the Internet, where the application can be filled in online, but we then call the customer and ask them to come into the branch to complete the application.

Who gets to record the revenue for the credit card application? Not the call centre or the Internet channel. Often it is the physical branch that executes the final signature on the application form and the KYC compliance check on the proof of income – so it happily records the revenue of the sale. But the branch has had practically zero involvement in the sale; it is simply a ‘step’ in an outmoded compliance process. So does the branch actually generate the revenue, or is it merely an accounting treatment?

The attitude of many retail banking senior executives seems to be that the branch is the serious banking channel, whereas the remaining “alternative” channels are just that – alternatives to the ‘real thing’. The problem is that customers simply don’t think like this. They don’t assign a higher value or priority to the branch; they just see it as one of the many channels through which they can choose to do their banking. In fact, many customers these days choose not to go to the branch because they don’t want to stand in line, or they find it troublesome to get to the branch during the hours it is open. Admittedly the branch is the premium service channel, but it is not the ONLY channel. So why don’t banks think the way customers do?

The longer banks choose to reinforce a belief that the branch is superior within the organization structure, the longer it will take them to match the performance of the bank to the changing behaviour of retail customers.

After Moore’s Law

In Technology Innovation on October 26, 2009 at 04:46

Excerpt from Chapter 9 – Deep Impact – Technology and Disruptive Innovation

Looking further into the future, there are really only two promising solutions to replace the silicon paradigm that has underpinned the flawless performance of Moore’s Law to date: quantum computing and DNA, or biological, computing.

Quantum computing essentially utilizes the quantum state of the qubit (the equivalent of a normal bit or byte in computing terms, but at the quantum level). Like a traditional bit, a qubit has an on and an off state, but whereas a bit can ONLY be 1 or 0, a qubit can also exist in a superposition of both states. Thus, depending on configuration and implementation, and thanks to the principles of entanglement and superposition (quantum mechanical phenomena), a quantum computer with just three qubits can hold a superposition of all 8 different three-bit strings at once. Because of the nature of quantum mechanics, it can effectively explore calculations across almost any combination of those states simultaneously.
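To make the state-space point concrete, here is a minimal, purely classical sketch in Python (using NumPy) that represents a three-qubit register as a vector of 2^3 = 8 amplitudes in equal superposition. It is only an illustration of why three qubits span all eight three-bit strings at once; it is not how a physical quantum computer is programmed.

```python
import numpy as np

# Classical simulation of an n-qubit register as a vector of 2**n amplitudes.
# A uniform superposition gives every basis state (every n-bit string) equal weight.
n_qubits = 3
n_states = 2 ** n_qubits                      # 8 three-bit strings for 3 qubits

state = np.full(n_states, 1 / np.sqrt(n_states), dtype=complex)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(state) ** 2

for index, p in enumerate(probabilities):
    print(f"|{index:03b}>  probability = {p:.3f}")   # each of the 8 strings: 0.125
```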

This means a completely different type of programming would be required, but it results in massive computing power. Programs, calculations or simulations that would take weeks, months or even years to complete on today’s platforms could be executed in real-time almost instantly. Chips the size of a grain of rice would be more powerful than today’s supercomputers, and use almost no power at all.

Recently some progress has been made in this field with Resonant Tunnelling Diodes (RTDs) and software modelling that simulates quantum processing. Needless to say, this is all very high-tech and the applications are mind-blowing. Computers will be everywhere, some of them as small as dust or embedded within our blood cells to keep check on our vitals. Near-instantaneous transfer of information will exist around the planet. The applications are endless.

So when will this all happen? Estimates of quantum hardware of this type being in commercial production range from 10 to 30 years. But MRAM (Magnetoresistive Random Access Memory), RTDs and other quantum applications are already in the market or in development. So it seems just a matter of time.

The other promising replacement for silicon technology is DNA computing, which uses DNA, biochemistry and molecular biology. It was first demonstrated as a concept by Leonard Adleman of the University of Southern California in 1994. Adleman demonstrated a proof-of-concept use of DNA as a form of computation which solved the seven-point Hamiltonian path problem. He used oligonucleotides, which is just a really fancy name for short DNA polymers. If you’ve ever watched an episode of CSI where they take a piece of evidence with a suspect’s DNA and put it in a solution to identify who it belongs to, you have watched one typical use of oligonucleotides, as they are often used to amplify DNA in what is called a polymerase chain reaction. Ok, ok, enough of the technobabble… well, almost.
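For a sense of what Adleman’s molecules were “computing”, here is a tiny brute-force Python sketch of the Hamiltonian path problem on a hypothetical seven-vertex graph (not Adleman’s actual instance, which used a directed graph with a fixed start and end): find an ordering that visits every vertex exactly once using only the graph’s edges.

```python
from itertools import permutations

# Brute force: a Hamiltonian path is an ordering of all vertices in which
# every consecutive pair of vertices is joined by an edge.
def hamiltonian_path(vertices, edges):
    edge_set = {frozenset(e) for e in edges}
    for ordering in permutations(vertices):
        if all(frozenset(pair) in edge_set for pair in zip(ordering, ordering[1:])):
            return ordering
    return None

# A hypothetical seven-vertex graph, for illustration only.
vertices = range(7)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 4), (2, 5)]
print(hamiltonian_path(vertices, edges))      # e.g. (0, 1, 2, 3, 4, 5, 6)
```

Checking every ordering like this blows up factorially with the number of vertices, which is exactly why massively parallel molecular approaches were so interesting.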

What does it all mean? Well, DNA computers will operate as molecular computers, or in other words, very, very small. In terms of capability, a typical desktop computer can execute around 10^8 operations per second, whereas supercomputers available today can execute around 10^14 operations per second. A single strand of DNA could execute around 10^20 operations per second; to put it in perspective, a DNA computer would be more than a thousand times faster than current supercomputers, while being about a million times more efficient in energy terms than a supercomputer. Impressive! Oh, and it could store 1 terabyte of data in the space we currently take to store about 1 KB.

So in theory, inside a cell in your body, you could carry a DNA computer with more computational power than the world’s most powerful supercomputer. This might be useful combined with nanotechnology to enhance our natural immune system response, or for even more exotic solutions such as augmenting our natural abilities or improving longevity by correcting cellular reproduction errors at the molecular level. Pretty wild…

Moore’s Law – Why computers are increasingly disruptive to industry

In Technology Innovation on October 25, 2009 at 06:57

Excerpt from Chapter 9 – Deep Impact: Technology and Disruptive Innovation

You’ve undoubtedly heard of “Silicon Valley”, right? Do you know why it is called Silicon Valley? You might think it is because of all the dot-com and Web 2.0 companies that inhabit this region of California, but you’d be wrong. We have to go much further back, to the 1950s, to find the origin of the term. It must have something to do with computer chips, because microchips are made of silicon…

Well, in 1947 a gentleman by the name of William Shockley, along with John Bardeen and Walter Brattain, invented the transistor. For this, the three were awarded the Nobel Prize in Physics in 1956. Shockley’s attempts to commercialize the transistor are what led to the formation of a cluster of companies in California specializing in the manufacture of these components. During the 50s and 60s there was a great deal of speculation in the markets about ‘tronics’, or the ability to capitalize on these ‘new’ technologies and advances.

On April 19th, 1965, Gordon Moore, the co-founder of Intel Corporation, published an article in Electronics Magazine entitled “Cramming more components onto integrated circuits”. In that article he stated a law of computing power that has remained consistent for more than 40 years, a law that drives technology development today and for the near future.

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year … Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer – Gordon Moore’s prediction in 1965.


The term “Moore’s Law” was reportedly coined in 1970 by the Caltech professor and VLSI pioneer Carver Mead. Essentially, Moore predicted that computing power would double every two years. Since 1965 that law has held true, and it remains the backbone of classical computing platform development. What this all means is that since 1965 we have been able to predict both the reduction in cost and the improvement in computing capability of microchips, and those predictions have held true.

What does this mean in reality? Let’s put it in perspective. In 1965 the number of transistors that fit on an integrated circuit could be counted in tens. In 1971 Intel introduced the 4004 microprocessor, with 2,300 transistors. In 1978, when Intel introduced the 8086 microprocessor, the IBM PC was effectively born (the first IBM PC used the related 8088 chip); this chip had 29,000 transistors. In 2006 Intel’s Itanium 2 processor carried 1,700,000,000 transistors. What does that mean? Transistors are now so small that more than a million of them could fit on the head of a pin. While all this was happening, the cost of these transistors was also falling exponentially, as per Moore’s prediction.
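As a rough illustration of the arithmetic, the short Python sketch below doubles a transistor count every two years from the 4004’s 2,300 transistors in 1971. The projections are order-of-magnitude only; real chips (such as the Itanium 2 figure above) deviate from the simple curve.

```python
# Rough Moore's Law sketch: transistor counts doubling every two years,
# starting from the Intel 4004 (2,300 transistors in 1971).
BASE_YEAR, BASE_COUNT = 1971, 2_300

def predicted_transistors(year, doubling_period_years=2):
    doublings = (year - BASE_YEAR) / doubling_period_years
    return BASE_COUNT * 2 ** doublings

for year in (1971, 1978, 1990, 2006):
    print(year, f"{predicted_transistors(year):,.0f}")
# 1971 -> 2,300; 1978 -> ~26,000 (the 8086 had 29,000); 2006 -> hundreds of millions
```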

In real terms this means that a mainframe computer of the 1970s that cost over $1 million has less computing power than your iPhone has today. It means that the USB memory stick you carry around in your pocket would have taken a room full of hard disk platters in the 70s. Have you ever watched the movie Apollo 13? Remember they were trying to work out how to fire up the Apollo Guidance Computer without breaking their remaining power allowance? Well, that computer, which was at the height of computing technology at the time, had around 32K of memory and ran at a clock speed of 1.024 MHz. When the IBM PC launched in 1981, it was already about 8 times faster than the Apollo computer. The next generation of smartphones we will be using in the next 2-3 years will have 1 GHz processor chips – roughly a thousand times the clock speed of the Apollo Guidance Computer…

These numbers are so mind-blowing that if we apply them to the world outside computing, things get a little bizarre. For example, if a house had shrunk at the same pace transistors have, you would not be able to see a house without a microscope. In 1978 a commercial flight between New York and Paris cost around US$900 and took seven hours. If Moore’s Law had applied to aviation in the same way as computing, that flight today would cost about 1 cent (a penny) and would take less than a second.
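The aviation analogy can be sanity-checked with the same kind of arithmetic: halve the 1978 price and flight time every two years (the inverse of the doubling above). The figures below are back-of-the-envelope only.

```python
# Back-of-the-envelope check of the aviation analogy: halve cost and duration
# every two years from 1978 (the inverse of Moore's Law doubling).
COST_1978_USD = 900
DURATION_1978_S = 7 * 3600            # a seven-hour flight, in seconds

def shrink(value, start_year, end_year, halving_period_years=2):
    halvings = (end_year - start_year) / halving_period_years
    return value / 2 ** halvings

year = 2009
print(f"cost   ~${shrink(COST_1978_USD, 1978, year):.2f}")     # about two cents
print(f"flight ~{shrink(DURATION_1978_S, 1978, year):.2f} s")  # about half a second
```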

Now you know why your technology budget is the way it is…

Customer Value Innovation

In Technology Innovation on October 22, 2009 at 13:23

Excerpt from Chapter 2 – Measuring the Customer Experience

An emerging field in customer experience, behavioural research and marketing is the area of value innovation. Value innovation, in strategic terms, is the creation of superior customer value with a view to gaining a competitive advantage and/or rejuvenating the institution or organization (Berghman & Matthyssens, 2006). Whereas organizations like Google thrive on constant innovation, traditional organizations like established financial institutions find such innovation more difficult to manage.

Innovation of the Customer Experience, however, is no longer a choice but a necessity, and more importantly a competitive weapon. As customer behaviour continues to evolve more and more rapidly, a culture of continuous improvement of the customer experience is required. This culture of innovation can then influence the entire institution, from management through to frontline staff.

This notion of value innovation goes beyond changes in product, process and services and includes:

• New ways of servicing customers/clients,
• New ways of offering value propositions to customers,
• New ways of collaborating with customers,
• New ways of working, and
• New ways of networking competencies and resources

To effectively measure innovation in the institution, we need to measure how innovation is conceptualized, how well the innovation is oriented toward the customer (instead of solely internally), the approach that will be used to roll out the innovation or proof of concept, how we are learning as an organization, and what research (R&D) is being carried out.

Figure - Achieving Customer Value Innovation

A number of institutions have recently appointed a Head of Innovation or Chief Innovation Officer to set out to accomplish this task. However, the major issue currently facing these institutions is that we simply don’t measure the experience of the customer collectively, so any attempt at innovation is likely to be met with frustration because the organization does not understand the competitive benefit likely to be gained.

A good summary of value innovation is defined as:

…the result of the creative efforts of individuals directed by managerial methods that stimulate and foster a culture of learning and creativity within the organization. (Colurcio & Mele, 2008)

Here is the conflict: the more we enforce ‘traditional’ process within an institution, the less likely we are to generate creative solutions to customer issues and problems. The common answer becomes, “Sorry, the bank does not allow us to do that, Sir!” Until we value the ability of our staff to innovate around the customer and support such innovation, the culture of punishing a staff member for stepping outside of the ‘normal’ procedure is likely to hinder progress in this area.

Why not try Google’s approach? Google offers its staff 20% of their working week to work on innovation and improvement of the business. They can work on any project or assignment they like, and this often results in project and work groups being spontaneously created by like-minded colleagues with an ‘idea’ on how to do something better.

Why not allow every department and product team to create an innovation role? Give the Chief Innovation Officer a dotted-line of oversight of these resources in addition to the department leads, and let them go for it!

Figure - Creating an Innovation Network

Thanks for listening

From ATM, Call Centre and Web stats to iPhone Banking App

In Mobile Banking on October 18, 2009 at 04:48

Excerpt from BANK 2.0 – Chapter 6: Mobile Banking

Here is a global snapshot of the typical top-5 monthly requests or active transaction demands via ATM, Call Centre and Internet Banking. This is taken from a series of research projects, analytics, customer focus groups and surveys. From market to market the trends are generally consistent, with some local variations around specific bill payment types or similar. The data can easily be verified for your institution by checking transaction usage data from the ATM, Call Centre and IVR transaction statistics, and total web analytics (probably through WebTrends or similar). The Top 5 for each channel are:

Top 5 transaction types globally for ATM, Call Centre and Web respectively (source: UserStrategy.com)

So the list above is actually the perfect base functionality for an iPhone or Android application for ALL of your customers. The only one of these we can’t replicate on the handset is the cash withdrawal function, although mobile payments may make that increasingly redundant.

There are really two classifications of bank-enabled functionality embedded in such an application. The first classification is transactional, the second is content. Account Balance, Credit Card Balance, Recent Transactions, Loyalty program miles/points balance and Bill Payment content can all be data that is streamed to a customer’s phone. In fact, customers may even pay for this as a service. Functionality such as Transfers, initiating a Bill Payment or making a Credit Card Payment needs interaction through a transactional platform.

The clear proposition here is that the customer registers the application through the bank or through the app store and sets up an initial level of authentication in the application, so that the streaming data can be delivered or refreshed each time the app fires up. If the account information shown is limited to the last four digits of the account number, there is hardly any risk of abuse. Even if the phone is lost, the streamed data is purely informational, with no content that is open to abuse through fraud.

The additional transactions such as transfers could be confirmed with a secure login or authentication each time, just as with Internet Banking.
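To illustrate the split described above between streamed “content” and authenticated “transactional” functionality, here is a hypothetical Python sketch. All names (ContentFeed, refresh_feed, submit_transfer) and values are illustrative assumptions, not a real banking API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentFeed:
    """Read-only data streamed to a registered device; the account number is masked."""
    masked_account: str                       # only the last four digits, e.g. "****1234"
    account_balance: float
    card_balance: float
    recent_transactions: List[str] = field(default_factory=list)
    loyalty_points: int = 0

def refresh_feed(device_token: str) -> ContentFeed:
    """The device was registered once; its token is enough to refresh informational content."""
    # Placeholder for a call to the bank's content service.
    return ContentFeed("****1234", 2450.17, -320.50, ["Grocery -45.20"], 18200)

def submit_transfer(device_token: str, one_time_credentials: str,
                    to_payee: str, amount: float) -> bool:
    """Transactional functions re-authenticate on every request, as with Internet banking."""
    if not one_time_credentials:              # stand-in for a secure login / OTP check
        raise PermissionError("Fresh authentication required for transactions")
    return True                               # placeholder for the core-banking call

feed = refresh_feed("registered-device-token")
print(feed.masked_account, feed.account_balance)
print(submit_transfer("registered-device-token", "123456", "Utility Co", 89.90))
```

The design point is simply that the informational feed carries nothing exploitable on its own, while anything that moves money goes through a separate, re-authenticated path.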